Apr 24 16:39:01.209424 ip-10-0-137-69 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 16:39:01.645446 ip-10-0-137-69 kubenswrapper[2563]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:39:01.645446 ip-10-0-137-69 kubenswrapper[2563]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 16:39:01.645446 ip-10-0-137-69 kubenswrapper[2563]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:39:01.645446 ip-10-0-137-69 kubenswrapper[2563]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 16:39:01.645446 ip-10-0-137-69 kubenswrapper[2563]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:39:01.646534 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.646132    2563 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 16:39:01.651843 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651828    2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:39:01.651843 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651842    2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:39:01.651912 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651846    2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:39:01.651912 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651850    2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:39:01.651912 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651853    2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:39:01.651912 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651857    2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:39:01.651912 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651860    2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:39:01.651912 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651863    2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:39:01.651912 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651865    2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:39:01.651912 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651868    2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:39:01.651912 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651871    2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:39:01.651912 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651874    2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:39:01.651912 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651876    2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:39:01.651912 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651879    2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:39:01.651912 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651882    2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:39:01.651912 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651885    2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:39:01.651912 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651888    2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:39:01.651912 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651891    2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:39:01.651912 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651893    2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:39:01.651912 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651896    2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:39:01.651912 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651902    2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:39:01.651912 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651905    2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:39:01.652400 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651908    2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:39:01.652400 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651910    2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:39:01.652400 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651913    2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:39:01.652400 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651916    2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:39:01.652400 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651920    2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:39:01.652400 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651924    2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:39:01.652400 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651927    2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:39:01.652400 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651930    2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:39:01.652400 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651932    2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:39:01.652400 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651935    2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:39:01.652400 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651937    2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:39:01.652400 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651940    2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:39:01.652400 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651943    2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:39:01.652400 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651945    2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:39:01.652400 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651948    2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:39:01.652400 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651951    2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:39:01.652400 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651953    2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:39:01.652400 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651956    2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:39:01.652400 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651958    2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:39:01.652400 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651961    2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:39:01.652878 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651964    2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:39:01.652878 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651966    2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:39:01.652878 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651969    2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:39:01.652878 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651972    2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:39:01.652878 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651975    2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:39:01.652878 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651978    2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:39:01.652878 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651980    2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:39:01.652878 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651983    2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:39:01.652878 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651985    2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:39:01.652878 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651988    2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:39:01.652878 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651991    2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:39:01.652878 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651993    2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:39:01.652878 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651996    2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:39:01.652878 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.651999    2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:39:01.652878 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652002    2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:39:01.652878 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652005    2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:39:01.652878 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652007    2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:39:01.652878 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652012    2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:39:01.652878 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652016    2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:39:01.653385 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652019    2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:39:01.653385 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652022    2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:39:01.653385 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652025    2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:39:01.653385 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652028    2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:39:01.653385 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652031    2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:39:01.653385 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652034    2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:39:01.653385 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652037    2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:39:01.653385 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652040    2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:39:01.653385 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652042    2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:39:01.653385 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652045    2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:39:01.653385 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652047    2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:39:01.653385 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652050    2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:39:01.653385 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652053    2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:39:01.653385 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652062    2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:39:01.653385 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652064    2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:39:01.653385 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652067    2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:39:01.653385 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652070    2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:39:01.653385 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652073    2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:39:01.653385 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652076    2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:39:01.653385 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652079    2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:39:01.653862 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652082    2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:39:01.653862 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652084    2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:39:01.653862 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652087    2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:39:01.653862 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652090    2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:39:01.653862 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652092    2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:39:01.653862 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652487    2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:39:01.653862 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652493    2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:39:01.653862 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652496    2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:39:01.653862 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652499    2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:39:01.653862 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652502    2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:39:01.653862 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652505    2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:39:01.653862 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652507    2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:39:01.653862 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652510    2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:39:01.653862 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652512    2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:39:01.653862 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652515    2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:39:01.653862 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652518    2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:39:01.653862 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652520    2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:39:01.653862 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652523    2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:39:01.653862 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652525    2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:39:01.653862 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652527    2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:39:01.654359 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652530    2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:39:01.654359 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652533    2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:39:01.654359 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652536    2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:39:01.654359 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652538    2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:39:01.654359 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652540    2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:39:01.654359 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652544    2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:39:01.654359 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652547    2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:39:01.654359 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652550    2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:39:01.654359 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652552    2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:39:01.654359 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652555    2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:39:01.654359 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652558    2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:39:01.654359 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652560    2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:39:01.654359 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652563    2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:39:01.654359 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652565    2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:39:01.654359 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652568    2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:39:01.654359 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652570    2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:39:01.654359 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652573    2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:39:01.654359 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652576    2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:39:01.654359 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652579    2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:39:01.654359 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652581    2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:39:01.654849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652583    2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:39:01.654849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652586    2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:39:01.654849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652588    2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:39:01.654849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652591    2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:39:01.654849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652593    2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:39:01.654849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652596    2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:39:01.654849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652599    2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:39:01.654849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652601    2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:39:01.654849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652604    2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:39:01.654849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652607    2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:39:01.654849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652609    2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:39:01.654849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652612    2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:39:01.654849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652614    2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:39:01.654849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652617    2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:39:01.654849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652619    2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:39:01.654849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652622    2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:39:01.654849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652624    2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:39:01.654849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652626    2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:39:01.654849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652630    2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:39:01.654849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652632    2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:39:01.655345 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652635    2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:39:01.655345 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652638    2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:39:01.655345 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652641    2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:39:01.655345 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652643    2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:39:01.655345 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652646    2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:39:01.655345 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652649    2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:39:01.655345 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652651    2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:39:01.655345 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652654    2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:39:01.655345 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652656    2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:39:01.655345 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652660    2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:39:01.655345 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652663    2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:39:01.655345 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652665    2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:39:01.655345 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652669    2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:39:01.655345 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652672    2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:39:01.655345 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652675    2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:39:01.655345 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652677    2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:39:01.655345 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652680    2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:39:01.655345 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652683    2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:39:01.655345 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652687    2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652691    2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652693    2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652696    2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652698    2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652701    2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652703    2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652706    2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652708    2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652711    2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652713    2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652716    2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.652718    2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653847    2563 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653856    2563 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653864    2563 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653869    2563 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653873    2563 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653877    2563 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653881    2563 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653892    2563 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 16:39:01.655809 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653896    2563 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653899    2563 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653904    2563 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653907    2563 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653911    2563 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653913    2563 flags.go:64] FLAG: --cgroup-root=""
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653917    2563 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653920    2563 flags.go:64] FLAG: --client-ca-file=""
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653923    2563 flags.go:64] FLAG: --cloud-config=""
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653926    2563 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653928    2563 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653932    2563 flags.go:64] FLAG: --cluster-domain=""
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653935    2563 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653938    2563 flags.go:64] FLAG: --config-dir=""
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653941    2563 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653944    2563 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653948    2563 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653951    2563 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653954    2563 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653957    2563 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653960    2563 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653963    2563 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653966    2563 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653969    2563 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653972    2563 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 16:39:01.656326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653977    2563 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653980    2563 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653983    2563 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653986    2563 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653989    2563 flags.go:64] FLAG: --enable-server="true"
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653992    2563 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.653997    2563 flags.go:64] FLAG: --event-burst="100"
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654000    2563 flags.go:64] FLAG: --event-qps="50"
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654004    2563 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654006    2563 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654009    2563 flags.go:64] FLAG: --eviction-hard=""
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654013    2563 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654017    2563 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654019    2563 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654022    2563 flags.go:64] FLAG: --eviction-soft=""
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654025    2563 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654028    2563 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654031    2563 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654034    2563 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654037    2563 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654040    2563 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654043    2563 flags.go:64] FLAG: --feature-gates=""
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654046    2563 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654049    2563 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654052    2563 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 24 16:39:01.656915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654055    2563 flags.go:64] FLAG:
--healthz-bind-address="127.0.0.1" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654058 2563 flags.go:64] FLAG: --healthz-port="10248" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654061 2563 flags.go:64] FLAG: --help="false" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654064 2563 flags.go:64] FLAG: --hostname-override="ip-10-0-137-69.ec2.internal" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654067 2563 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654071 2563 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654073 2563 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654077 2563 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654081 2563 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654084 2563 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654087 2563 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654089 2563 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654092 2563 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654095 2563 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 16:39:01.657516 ip-10-0-137-69 
kubenswrapper[2563]: I0424 16:39:01.654098 2563 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654101 2563 flags.go:64] FLAG: --kube-reserved="" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654103 2563 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654106 2563 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654109 2563 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654112 2563 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654115 2563 flags.go:64] FLAG: --lock-file="" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654118 2563 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654120 2563 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654123 2563 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 16:39:01.657516 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654129 2563 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654142 2563 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654145 2563 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654148 2563 flags.go:64] FLAG: --logging-format="text" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654151 2563 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 
16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654154 2563 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654157 2563 flags.go:64] FLAG: --manifest-url="" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654160 2563 flags.go:64] FLAG: --manifest-url-header="" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654164 2563 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654167 2563 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654171 2563 flags.go:64] FLAG: --max-pods="110" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654178 2563 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654181 2563 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654184 2563 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654187 2563 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654190 2563 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654193 2563 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654198 2563 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654205 2563 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 
16:39:01.654208 2563 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654211 2563 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654214 2563 flags.go:64] FLAG: --pod-cidr="" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654217 2563 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 16:39:01.658129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654223 2563 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654225 2563 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654228 2563 flags.go:64] FLAG: --pods-per-core="0" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654231 2563 flags.go:64] FLAG: --port="10250" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654234 2563 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654237 2563 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-09c426fb5e3ff0461" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654240 2563 flags.go:64] FLAG: --qos-reserved="" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654243 2563 flags.go:64] FLAG: --read-only-port="10255" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654246 2563 flags.go:64] FLAG: --register-node="true" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654249 2563 flags.go:64] FLAG: --register-schedulable="true" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654252 2563 flags.go:64] FLAG: 
--register-with-taints="" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654256 2563 flags.go:64] FLAG: --registry-burst="10" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654259 2563 flags.go:64] FLAG: --registry-qps="5" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654262 2563 flags.go:64] FLAG: --reserved-cpus="" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654264 2563 flags.go:64] FLAG: --reserved-memory="" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654271 2563 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654274 2563 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654277 2563 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654280 2563 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654283 2563 flags.go:64] FLAG: --runonce="false" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654286 2563 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654290 2563 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654293 2563 flags.go:64] FLAG: --seccomp-default="false" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654296 2563 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654299 2563 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654302 2563 
flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 16:39:01.658699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654306 2563 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654311 2563 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654314 2563 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654317 2563 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654319 2563 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654322 2563 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654325 2563 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654328 2563 flags.go:64] FLAG: --system-cgroups="" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654331 2563 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654336 2563 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654339 2563 flags.go:64] FLAG: --tls-cert-file="" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654342 2563 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654346 2563 flags.go:64] FLAG: --tls-min-version="" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654349 2563 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 16:39:01.659408 ip-10-0-137-69 
kubenswrapper[2563]: I0424 16:39:01.654352 2563 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654354 2563 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654357 2563 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654361 2563 flags.go:64] FLAG: --v="2" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654365 2563 flags.go:64] FLAG: --version="false" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654370 2563 flags.go:64] FLAG: --vmodule="" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654374 2563 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654377 2563 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654488 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654492 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:39:01.659408 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654494 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 16:39:01.659989 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654497 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 16:39:01.659989 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654500 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:39:01.659989 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654504 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:39:01.659989 
ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654507 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:39:01.659989 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654510 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:39:01.659989 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654512 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:39:01.659989 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654515 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:39:01.659989 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654519 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:39:01.659989 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654522 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:39:01.659989 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654525 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:39:01.659989 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654528 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:39:01.659989 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654531 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:39:01.659989 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654533 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:39:01.659989 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654536 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:39:01.659989 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654539 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:39:01.659989 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654541 2563 feature_gate.go:328] 
unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:39:01.659989 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654544 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:39:01.659989 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654547 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:39:01.659989 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654549 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 16:39:01.659989 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654552 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:39:01.660494 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654554 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:39:01.660494 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654557 2563 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:39:01.660494 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654559 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:39:01.660494 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654562 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:39:01.660494 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654564 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:39:01.660494 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654567 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:39:01.660494 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654570 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:39:01.660494 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654572 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:39:01.660494 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654576 2563 
feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 16:39:01.660494 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654580 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 16:39:01.660494 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654583 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 16:39:01.660494 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654585 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:39:01.660494 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654588 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:39:01.660494 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654591 2563 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:39:01.660494 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654593 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:39:01.660494 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654596 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 16:39:01.660494 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654599 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:39:01.660494 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654601 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:39:01.660494 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654604 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:39:01.660494 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654606 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:39:01.660981 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654609 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 
16:39:01.660981 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654612 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:39:01.660981 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654615 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:39:01.660981 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654617 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:39:01.660981 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654620 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:39:01.660981 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654623 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:39:01.660981 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654625 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:39:01.660981 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654628 2563 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:39:01.660981 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654630 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:39:01.660981 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654633 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:39:01.660981 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654635 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:39:01.660981 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654638 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 16:39:01.660981 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654641 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:39:01.660981 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654643 2563 feature_gate.go:328] 
unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:39:01.660981 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654645 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:39:01.660981 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654650 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 16:39:01.660981 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654653 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:39:01.660981 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654656 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:39:01.660981 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654658 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:39:01.661460 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654661 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 16:39:01.661460 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654664 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 16:39:01.661460 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654667 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:39:01.661460 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654669 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:39:01.661460 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654671 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:39:01.661460 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654674 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:39:01.661460 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654676 2563 feature_gate.go:328] unrecognized feature gate: 
AdditionalRoutingCapabilities Apr 24 16:39:01.661460 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654679 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 16:39:01.661460 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654681 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:39:01.661460 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654684 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 16:39:01.661460 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654686 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:39:01.661460 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654689 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:39:01.661460 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654692 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:39:01.661460 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654694 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:39:01.661460 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654697 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:39:01.661460 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654699 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:39:01.661460 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654702 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:39:01.661460 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654705 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:39:01.661460 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654707 2563 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:39:01.661460 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654710 2563 
feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:39:01.661982 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654712 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:39:01.661982 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654715 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:39:01.661982 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654717 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:39:01.661982 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.654720 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:39:01.661982 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.654726 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 16:39:01.661982 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.661590 2563 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 16:39:01.661982 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.661605 2563 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 16:39:01.661982 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661654 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:39:01.661982 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661659 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 16:39:01.661982 ip-10-0-137-69 kubenswrapper[2563]: W0424 
16:39:01.661662 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:39:01.661982 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661665 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:39:01.661982 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661668 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:39:01.661982 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661671 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:39:01.661982 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661674 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:39:01.661982 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661677 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:39:01.662383 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661680 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:39:01.662383 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661683 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:39:01.662383 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661686 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:39:01.662383 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661689 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:39:01.662383 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661691 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:39:01.662383 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661694 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:39:01.662383 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661697 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:39:01.662383 
ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661700 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 16:39:01.662383 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661702 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:39:01.662383 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661705 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:39:01.662383 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661707 2563 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:39:01.662383 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661710 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:39:01.662383 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661713 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:39:01.662383 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661722 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:39:01.662383 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661724 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:39:01.662383 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661727 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:39:01.662383 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661729 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:39:01.662383 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661732 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 16:39:01.662383 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661735 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:39:01.662383 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661737 2563 feature_gate.go:328] unrecognized feature gate: 
NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:39:01.662880 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661740 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:39:01.662880 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661744 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 16:39:01.662880 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661748 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:39:01.662880 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661751 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:39:01.662880 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661754 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:39:01.662880 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661756 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:39:01.662880 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661759 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:39:01.662880 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661761 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:39:01.662880 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661764 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:39:01.662880 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661766 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 16:39:01.662880 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661769 2563 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:39:01.662880 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661772 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:39:01.662880 ip-10-0-137-69 
kubenswrapper[2563]: W0424 16:39:01.661774 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:39:01.662880 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661777 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:39:01.662880 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661780 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:39:01.662880 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661782 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:39:01.662880 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661785 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 16:39:01.662880 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661787 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:39:01.662880 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661790 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:39:01.662880 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661792 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:39:01.663384 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661795 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:39:01.663384 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661797 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:39:01.663384 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661802 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:39:01.663384 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661805 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:39:01.663384 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661807 2563 feature_gate.go:328] unrecognized feature gate: 
NutanixMultiSubnets Apr 24 16:39:01.663384 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661810 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 16:39:01.663384 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661816 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 16:39:01.663384 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661819 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:39:01.663384 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661821 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:39:01.663384 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661824 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:39:01.663384 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661826 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:39:01.663384 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661829 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:39:01.663384 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661831 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:39:01.663384 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661834 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:39:01.663384 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661837 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:39:01.663384 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661840 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:39:01.663384 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661842 2563 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:39:01.663384 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661845 2563 feature_gate.go:328] unrecognized 
feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:39:01.663384 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661849 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 16:39:01.663849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661853 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:39:01.663849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661856 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:39:01.663849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661858 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:39:01.663849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661861 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:39:01.663849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661863 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:39:01.663849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661865 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:39:01.663849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661868 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:39:01.663849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661870 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:39:01.663849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661873 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:39:01.663849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661875 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:39:01.663849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661878 2563 feature_gate.go:328] unrecognized feature gate: 
ClusterAPIInstall Apr 24 16:39:01.663849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661880 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:39:01.663849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661882 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 16:39:01.663849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661891 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:39:01.663849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661895 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:39:01.663849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661897 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 16:39:01.663849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661900 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 16:39:01.663849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661903 2563 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:39:01.663849 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.661905 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:39:01.664397 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.661911 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 16:39:01.664397 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662005 2563 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:39:01.664397 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662010 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:39:01.664397 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662013 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 16:39:01.664397 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662017 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:39:01.664397 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662020 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:39:01.664397 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662023 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:39:01.664397 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662025 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:39:01.664397 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662029 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:39:01.664397 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662032 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:39:01.664397 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662035 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:39:01.664397 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662037 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:39:01.664397 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662040 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:39:01.664397 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662042 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 
16:39:01.664397 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662045 2563 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:39:01.664765 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662048 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 16:39:01.664765 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662050 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:39:01.664765 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662053 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:39:01.664765 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662056 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:39:01.664765 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662058 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:39:01.664765 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662061 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:39:01.664765 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662063 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:39:01.664765 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662066 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:39:01.664765 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662069 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 16:39:01.664765 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662071 2563 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:39:01.664765 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662074 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:39:01.664765 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662076 2563 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:39:01.664765 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662079 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:39:01.664765 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662081 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 16:39:01.664765 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662084 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 16:39:01.664765 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662086 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:39:01.664765 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662088 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 16:39:01.664765 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662091 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:39:01.664765 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662093 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:39:01.664765 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662096 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:39:01.665260 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662099 2563 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:39:01.665260 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662101 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:39:01.665260 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662104 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:39:01.665260 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662106 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:39:01.665260 ip-10-0-137-69 kubenswrapper[2563]: W0424 
16:39:01.662109 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:39:01.665260 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662111 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:39:01.665260 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662114 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:39:01.665260 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662117 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:39:01.665260 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662119 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:39:01.665260 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662122 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:39:01.665260 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662124 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:39:01.665260 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662127 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 16:39:01.665260 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662129 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:39:01.665260 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662147 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:39:01.665260 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662151 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 16:39:01.665260 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662154 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:39:01.665260 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662157 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:39:01.665260 ip-10-0-137-69 
kubenswrapper[2563]: W0424 16:39:01.662160 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:39:01.665260 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662162 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:39:01.665260 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662165 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:39:01.665260 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662168 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 16:39:01.665758 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662170 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:39:01.665758 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662173 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 16:39:01.665758 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662175 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:39:01.665758 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662177 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:39:01.665758 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662180 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:39:01.665758 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662182 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:39:01.665758 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662185 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:39:01.665758 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662188 2563 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:39:01.665758 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662192 2563 feature_gate.go:351] Setting GA feature gate 
ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 16:39:01.665758 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662195 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:39:01.665758 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662198 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:39:01.665758 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662201 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:39:01.665758 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662204 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:39:01.665758 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662207 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:39:01.665758 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662210 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:39:01.665758 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662213 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:39:01.665758 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662215 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:39:01.665758 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662218 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 16:39:01.665758 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662221 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:39:01.666231 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662224 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:39:01.666231 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662227 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 
16:39:01.666231 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662230 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:39:01.666231 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662233 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:39:01.666231 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662236 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:39:01.666231 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662238 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:39:01.666231 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662241 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:39:01.666231 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662244 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:39:01.666231 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662246 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:39:01.666231 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662249 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:39:01.666231 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662251 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:39:01.666231 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:01.662254 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:39:01.666231 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.662259 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 16:39:01.666231 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.663055 2563 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 16:39:01.666231 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.665249 2563 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 16:39:01.666590 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.666176 2563 server.go:1019] "Starting client certificate rotation" Apr 24 16:39:01.666590 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.666268 2563 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 16:39:01.666590 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.666303 2563 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 16:39:01.688652 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.688634 2563 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 16:39:01.691294 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.691255 2563 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 16:39:01.705831 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.705810 2563 log.go:25] "Validated CRI v1 runtime API" Apr 24 16:39:01.711620 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.711605 2563 log.go:25] "Validated CRI v1 image API" Apr 24 16:39:01.713700 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.713684 2563 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 16:39:01.717521 ip-10-0-137-69 
kubenswrapper[2563]: I0424 16:39:01.717503 2563 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 16:39:01.717964 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.717944 2563 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 9c36e63c-576b-4538-9d03-14f13ca6f81c:/dev/nvme0n1p3 f88651b3-67ca-418d-b4a7-8e25765636ef:/dev/nvme0n1p4] Apr 24 16:39:01.718002 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.717964 2563 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 16:39:01.723270 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.723166 2563 manager.go:217] Machine: {Timestamp:2026-04-24 16:39:01.722063052 +0000 UTC m=+0.392693945 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100052 MemoryCapacity:33164500992 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2f985bc6c031af1834512275c41c7e SystemUUID:ec2f985b-c6c0-31af-1834-512275c41c7e BootID:cd78193b-def3-4363-9f93-8f45d1771f61 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582250496 Type:vfs Inodes:4048401 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 
DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:41:ce:e1:cf:e3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:41:ce:e1:cf:e3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:8a:d3:3a:ae:f8:4e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164500992 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 16:39:01.723270 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.723265 2563 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 24 16:39:01.723384 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.723340 2563 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 16:39:01.724436 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.724412 2563 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 16:39:01.724584 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.724438 2563 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-69.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 16:39:01.724633 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.724594 2563 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 16:39:01.724633 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.724603 2563 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 16:39:01.724633 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.724615 2563 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 16:39:01.724633 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.724628 2563 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 16:39:01.725935 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.725924 2563 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 16:39:01.726230 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.726220 2563 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 16:39:01.728291 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.728283 2563 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 16:39:01.728331 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.728295 2563 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 16:39:01.728978 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.728969 2563 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 16:39:01.729016 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.728982 2563 kubelet.go:397] "Adding apiserver pod source"
Apr 24 16:39:01.729016 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.728990 2563 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 16:39:01.730122 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.730109 2563 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 16:39:01.730191 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.730127 2563 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 16:39:01.733207 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.733192 2563 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 16:39:01.734451 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.734438 2563 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 16:39:01.736104 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.736093 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 16:39:01.736163 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.736110 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 16:39:01.736163 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.736118 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 16:39:01.736163 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.736123 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 16:39:01.736163 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.736128 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24 16:39:01.736163 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.736148 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 24 16:39:01.736302 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.736182 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 24 16:39:01.736302 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.736188 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 24 16:39:01.736302 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.736195 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 24 16:39:01.736302 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.736201 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 24 16:39:01.736302 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.736217 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 24 16:39:01.736302 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.736225 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 24 16:39:01.737812 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.737800 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 24 16:39:01.737862 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.737814 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 24 16:39:01.742306 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.742291 2563 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 24 16:39:01.742383 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.742337 2563 server.go:1295] "Started kubelet"
Apr 24 16:39:01.742471 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.742434 2563 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 16:39:01.742994 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.742949 2563 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 24 16:39:01.743086 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.743012 2563 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 24 16:39:01.743350 ip-10-0-137-69 systemd[1]: Started Kubernetes Kubelet.
Apr 24 16:39:01.743949 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.743595 2563 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-69.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 16:39:01.743949 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:01.743624 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 24 16:39:01.743949 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:01.743823 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-69.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 16:39:01.744467 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.744436 2563 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 16:39:01.745026 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.744994 2563 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 16:39:01.749739 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.749716 2563 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 16:39:01.750490 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.750472 2563 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 16:39:01.751247 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.751227 2563 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 16:39:01.751247 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.751247 2563 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 16:39:01.751407 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.751230 2563 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 16:39:01.751407 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:01.751362 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-69.ec2.internal\" not found"
Apr 24 16:39:01.751407 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.751381 2563 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 16:39:01.751407 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.751392 2563 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 16:39:01.752379 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.752361 2563 factory.go:55] Registering systemd factory
Apr 24 16:39:01.752457 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.752388 2563 factory.go:223] Registration of the systemd container factory successfully
Apr 24 16:39:01.752674 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.752653 2563 factory.go:153] Registering CRI-O factory
Apr 24 16:39:01.752674 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.752675 2563 factory.go:223] Registration of the crio container factory successfully
Apr 24 16:39:01.752809 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.752727 2563 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 16:39:01.752809 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.752753 2563 factory.go:103] Registering Raw factory
Apr 24 16:39:01.752809 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.752767 2563 manager.go:1196] Started watching for new ooms in manager
Apr 24 16:39:01.753246 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.753231 2563 manager.go:319] Starting recovery of all containers
Apr 24 16:39:01.753507 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:01.753335 2563 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-69.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 24 16:39:01.753982 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:01.753961 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 24 16:39:01.754061 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:01.753230 2563 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 16:39:01.754301 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:01.753432 2563 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-69.ec2.internal.18a9586a581adf07 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-69.ec2.internal,UID:ip-10-0-137-69.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-69.ec2.internal,},FirstTimestamp:2026-04-24 16:39:01.742305031 +0000 UTC m=+0.412935926,LastTimestamp:2026-04-24 16:39:01.742305031 +0000 UTC m=+0.412935926,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-69.ec2.internal,}"
Apr 24 16:39:01.763335 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.763229 2563 manager.go:324] Recovery completed
Apr 24 16:39:01.768722 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.768703 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 16:39:01.771105 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.771087 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-69.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 16:39:01.771190 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.771119 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-69.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 16:39:01.771190 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.771129 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-69.ec2.internal" event="NodeHasSufficientPID"
Apr 24 16:39:01.771677 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.771663 2563 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 16:39:01.771677 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.771675 2563 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 16:39:01.771783 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.771690 2563 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 16:39:01.773490 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:01.773428 2563 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-69.ec2.internal.18a9586a59d2529d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-69.ec2.internal,UID:ip-10-0-137-69.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-137-69.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-137-69.ec2.internal,},FirstTimestamp:2026-04-24 16:39:01.771104925 +0000 UTC m=+0.441735817,LastTimestamp:2026-04-24 16:39:01.771104925 +0000 UTC m=+0.441735817,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-69.ec2.internal,}"
Apr 24 16:39:01.773866 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.773850 2563 policy_none.go:49] "None policy: Start"
Apr 24 16:39:01.773866 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.773865 2563 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 16:39:01.773980 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.773875 2563 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 16:39:01.777188 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.777171 2563 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jmrqb"
Apr 24 16:39:01.783766 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:01.783699 2563 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-69.ec2.internal.18a9586a59d29d68 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-69.ec2.internal,UID:ip-10-0-137-69.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-137-69.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-137-69.ec2.internal,},FirstTimestamp:2026-04-24 16:39:01.771124072 +0000 UTC m=+0.441754964,LastTimestamp:2026-04-24 16:39:01.771124072 +0000 UTC m=+0.441754964,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-69.ec2.internal,}"
Apr 24 16:39:01.785376 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.785360 2563 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jmrqb"
Apr 24 16:39:01.813730 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.813715 2563 manager.go:341] "Starting Device Plugin manager"
Apr 24 16:39:01.827494 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:01.813752 2563 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 16:39:01.827494 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.813763 2563 server.go:85] "Starting device plugin registration server"
Apr 24 16:39:01.827494 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.814061 2563 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 16:39:01.827494 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.814073 2563 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 16:39:01.827494 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.814231 2563 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 16:39:01.827494 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.814319 2563 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 16:39:01.827494 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.814330 2563 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 16:39:01.827494 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:01.814984 2563 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 16:39:01.827494 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:01.815014 2563 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-69.ec2.internal\" not found"
Apr 24 16:39:01.853078 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.853052 2563 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 16:39:01.854174 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.854154 2563 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 16:39:01.854266 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.854182 2563 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 16:39:01.854266 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.854200 2563 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 16:39:01.854266 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.854213 2563 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 16:39:01.854266 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:01.854247 2563 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 16:39:01.856335 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.856317 2563 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 16:39:01.915118 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.915055 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 16:39:01.915842 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.915827 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-69.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 16:39:01.915911 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.915854 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-69.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 16:39:01.915911 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.915869 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-69.ec2.internal" event="NodeHasSufficientPID"
Apr 24 16:39:01.915911 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.915896 2563 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-69.ec2.internal"
Apr 24 16:39:01.924923 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.924900 2563 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-69.ec2.internal"
Apr 24 16:39:01.924923 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:01.924921 2563 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-69.ec2.internal\": node \"ip-10-0-137-69.ec2.internal\" not found"
Apr 24 16:39:01.954690 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.954668 2563 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-69.ec2.internal"]
Apr 24 16:39:01.954808 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.954729 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 16:39:01.955693 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:01.955679 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-69.ec2.internal\" not found"
Apr 24 16:39:01.956460 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.956448 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-69.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 16:39:01.956547 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.956471 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-69.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 16:39:01.956547 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.956481 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-69.ec2.internal" event="NodeHasSufficientPID"
Apr 24 16:39:01.957693 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.957681 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 16:39:01.958317 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.958301 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-69.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 16:39:01.958409 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.958327 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-69.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 16:39:01.958409 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.958341 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-69.ec2.internal" event="NodeHasSufficientPID"
Apr 24 16:39:01.958409 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.958376 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal"
Apr 24 16:39:01.958409 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.958401 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 16:39:01.958993 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.958982 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-69.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 16:39:01.959042 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.959004 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-69.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 16:39:01.959042 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.959014 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-69.ec2.internal" event="NodeHasSufficientPID"
Apr 24 16:39:01.959352 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.959340 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-69.ec2.internal"
Apr 24 16:39:01.959406 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.959364 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 16:39:01.959950 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.959936 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-69.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 16:39:01.960012 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.959964 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-69.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 16:39:01.960012 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:01.959979 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-69.ec2.internal" event="NodeHasSufficientPID"
Apr 24 16:39:01.981715 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:01.981699 2563 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-69.ec2.internal\" not found" node="ip-10-0-137-69.ec2.internal"
Apr 24 16:39:01.985961 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:01.985945 2563 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-69.ec2.internal\" not found" node="ip-10-0-137-69.ec2.internal"
Apr 24 16:39:02.052880 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.052857 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f61c386af5d7ea584ac3696d9ab65309-config\") pod \"kube-apiserver-proxy-ip-10-0-137-69.ec2.internal\" (UID: \"f61c386af5d7ea584ac3696d9ab65309\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-69.ec2.internal"
Apr 24 16:39:02.052988 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.052884 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c6abe1252542ce53f314b4a89b391fbf-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal\" (UID: \"c6abe1252542ce53f314b4a89b391fbf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal"
Apr 24 16:39:02.052988 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.052909 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6abe1252542ce53f314b4a89b391fbf-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal\" (UID: \"c6abe1252542ce53f314b4a89b391fbf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal"
Apr 24 16:39:02.055852 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:02.055838 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-69.ec2.internal\" not found"
Apr 24 16:39:02.153691 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.153668 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f61c386af5d7ea584ac3696d9ab65309-config\") pod \"kube-apiserver-proxy-ip-10-0-137-69.ec2.internal\" (UID: \"f61c386af5d7ea584ac3696d9ab65309\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-69.ec2.internal"
Apr 24 16:39:02.153774 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.153697 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c6abe1252542ce53f314b4a89b391fbf-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal\" (UID: \"c6abe1252542ce53f314b4a89b391fbf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal"
Apr 24 16:39:02.153774 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.153715 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6abe1252542ce53f314b4a89b391fbf-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal\" (UID: \"c6abe1252542ce53f314b4a89b391fbf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal"
Apr 24 16:39:02.153774 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.153741 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6abe1252542ce53f314b4a89b391fbf-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal\" (UID: \"c6abe1252542ce53f314b4a89b391fbf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal"
Apr 24 16:39:02.153774 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.153764 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c6abe1252542ce53f314b4a89b391fbf-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal\" (UID: \"c6abe1252542ce53f314b4a89b391fbf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal"
Apr 24 16:39:02.153923 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.153764 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f61c386af5d7ea584ac3696d9ab65309-config\") pod \"kube-apiserver-proxy-ip-10-0-137-69.ec2.internal\" (UID: \"f61c386af5d7ea584ac3696d9ab65309\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-69.ec2.internal"
Apr 24 16:39:02.156778 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:02.156763 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-69.ec2.internal\" not found"
Apr 24 16:39:02.257614 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:02.257576 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-69.ec2.internal\" not found"
Apr 24 16:39:02.283747 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.283729 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal"
Apr 24 16:39:02.288242 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.288225 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-69.ec2.internal"
Apr 24 16:39:02.358272 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:02.358247 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-69.ec2.internal\" not found"
Apr 24 16:39:02.458722 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:02.458695 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-69.ec2.internal\" not found"
Apr 24 16:39:02.559244 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:02.559186 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-69.ec2.internal\" not found"
Apr 24 16:39:02.659770 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:02.659744 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-69.ec2.internal\" not found"
Apr 24 16:39:02.665910 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.665891 2563 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 16:39:02.666058 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.666038 2563 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 16:39:02.749966 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.749939 2563 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 16:39:02.760660 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:02.760643 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-69.ec2.internal\" not found"
Apr 24 16:39:02.766553 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:02.766522 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6abe1252542ce53f314b4a89b391fbf.slice/crio-fd832818726a1431b161b289ecb78dd13d4a71ac9a671c0e5abcccf7df21f4a0 WatchSource:0}: Error finding container fd832818726a1431b161b289ecb78dd13d4a71ac9a671c0e5abcccf7df21f4a0: Status 404 returned error can't find the container with id fd832818726a1431b161b289ecb78dd13d4a71ac9a671c0e5abcccf7df21f4a0
Apr 24 16:39:02.767068 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:02.767050 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf61c386af5d7ea584ac3696d9ab65309.slice/crio-dc0aa572444b679e4a20ba3edfe47a40f88c74178c4780007f8e61d6dd388603 WatchSource:0}: Error finding container dc0aa572444b679e4a20ba3edfe47a40f88c74178c4780007f8e61d6dd388603: Status 404 returned error can't find the container with id dc0aa572444b679e4a20ba3edfe47a40f88c74178c4780007f8e61d6dd388603
Apr 24 16:39:02.768544 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.768524 2563 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 16:39:02.772303 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.772290 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 16:39:02.787751 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.787723 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 16:34:01 +0000 UTC" deadline="2027-09-24 23:23:07.990593966 +0000 UTC"
Apr 24 16:39:02.787751 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.787747 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12438h44m5.202849401s"
Apr 24 16:39:02.856977 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.856894 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal" event={"ID":"c6abe1252542ce53f314b4a89b391fbf","Type":"ContainerStarted","Data":"fd832818726a1431b161b289ecb78dd13d4a71ac9a671c0e5abcccf7df21f4a0"}
Apr 24 16:39:02.857761 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.857737 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-69.ec2.internal" event={"ID":"f61c386af5d7ea584ac3696d9ab65309","Type":"ContainerStarted","Data":"dc0aa572444b679e4a20ba3edfe47a40f88c74178c4780007f8e61d6dd388603"}
Apr 24 16:39:02.860882 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:02.860865 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-69.ec2.internal\" not found"
Apr 24 16:39:02.866144 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.866117 2563 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 16:39:02.918500 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.918477 2563 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-jn45z"
Apr 24 16:39:02.926349 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.926331 2563 csr.go:270] "Certificate
signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-jn45z" Apr 24 16:39:02.951444 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.951424 2563 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal" Apr 24 16:39:02.964177 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.964158 2563 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 16:39:02.965779 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.965765 2563 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-69.ec2.internal" Apr 24 16:39:02.974624 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:02.974605 2563 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 16:39:03.072353 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.072331 2563 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:39:03.263611 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.263584 2563 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:39:03.730396 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.730366 2563 apiserver.go:52] "Watching apiserver" Apr 24 16:39:03.736912 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.736893 2563 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 16:39:03.739009 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.738982 2563 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal","openshift-multus/multus-vzbf9","openshift-network-diagnostics/network-check-target-spk8g","openshift-network-operator/iptables-alerter-bfs48","kube-system/konnectivity-agent-xxmc2","kube-system/kube-apiserver-proxy-ip-10-0-137-69.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574","openshift-dns/node-resolver-5dlzs","openshift-multus/multus-additional-cni-plugins-hhp56","openshift-multus/network-metrics-daemon-bzld5","openshift-ovn-kubernetes/ovnkube-node-dmn7d","openshift-cluster-node-tuning-operator/tuned-b2vpl","openshift-image-registry/node-ca-j59hh"] Apr 24 16:39:03.741171 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.741148 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xxmc2" Apr 24 16:39:03.742539 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.742516 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574" Apr 24 16:39:03.743586 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.743455 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-5ml2p\"" Apr 24 16:39:03.743586 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.743532 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 16:39:03.743586 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.743523 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 16:39:03.743781 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.743767 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-bfs48" Apr 24 16:39:03.744406 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.744386 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 16:39:03.744606 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.744588 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 16:39:03.744752 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.744737 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 16:39:03.744881 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.744859 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-42nv5\"" Apr 24 16:39:03.745376 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.745152 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:39:03.745376 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:03.745219 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-spk8g" podUID="1fb076e8-3881-4444-98b6-2d67d3820579" Apr 24 16:39:03.745837 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.745811 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-flcs7\"" Apr 24 16:39:03.746013 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.745998 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:39:03.747180 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.746653 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 16:39:03.747180 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.746710 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 16:39:03.747596 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.747579 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5dlzs" Apr 24 16:39:03.749066 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.749048 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vzbf9" Apr 24 16:39:03.749261 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.749242 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hhp56" Apr 24 16:39:03.749876 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.749859 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 16:39:03.749972 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.749884 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 16:39:03.749972 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.749862 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qvpdb\"" Apr 24 16:39:03.750930 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.750694 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:03.750930 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:03.750772 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzld5" podUID="b98926f9-5237-4269-877b-422b4e1c6edf" Apr 24 16:39:03.751698 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.751678 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-j8bfh\"" Apr 24 16:39:03.751922 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.751868 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 16:39:03.751922 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.751878 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 16:39:03.752078 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.752055 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 16:39:03.752172 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.752101 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 16:39:03.752437 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.752228 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rg4d4\"" Apr 24 16:39:03.752437 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.752245 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 16:39:03.752776 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.752746 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 16:39:03.756482 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.756462 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.756574 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.756514 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-b2vpl" Apr 24 16:39:03.757789 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.757770 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j59hh" Apr 24 16:39:03.758477 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.758459 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:39:03.758658 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.758640 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jbtpc\"" Apr 24 16:39:03.758744 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.758668 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 16:39:03.759072 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.759005 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-9mjfw\"" Apr 24 16:39:03.759634 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.759616 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 16:39:03.759833 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.759812 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 16:39:03.759949 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.759931 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 16:39:03.761117 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.760319 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 16:39:03.761117 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.760381 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 16:39:03.761117 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.760544 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 16:39:03.761117 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.760668 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 16:39:03.761117 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.760676 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 16:39:03.761117 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.760710 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 16:39:03.761117 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.760941 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-phdpd\"" Apr 24 16:39:03.761956 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.761936 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fbb828be-4828-44c8-b5ec-6ffba7895c36-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: \"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574" Apr 24 16:39:03.762054 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.761973 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/71bf5297-d6b6-4b3b-a109-b99777f79b22-multus-daemon-config\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9" Apr 24 16:39:03.762054 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.761999 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/36eba3f6-5e75-4e04-8052-6248d70f2dd3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56" Apr 24 16:39:03.762054 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762023 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-run-netns\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.762242 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762072 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55acf3aa-ca3c-4efa-893f-473501b43621-ovnkube-config\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.762242 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762147 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-host-run-multus-certs\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9" Apr 24 16:39:03.762242 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762187 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b514d98b-1969-443b-b7cf-8c931162148a-hosts-file\") pod \"node-resolver-5dlzs\" (UID: \"b514d98b-1969-443b-b7cf-8c931162148a\") " pod="openshift-dns/node-resolver-5dlzs" Apr 24 16:39:03.762242 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762212 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdvvc\" (UniqueName: \"kubernetes.io/projected/5e8ebda8-9661-4160-961c-49db3596480b-kube-api-access-xdvvc\") pod \"iptables-alerter-bfs48\" (UID: \"5e8ebda8-9661-4160-961c-49db3596480b\") " pod="openshift-network-operator/iptables-alerter-bfs48" Apr 24 16:39:03.762431 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762266 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55acf3aa-ca3c-4efa-893f-473501b43621-ovn-node-metrics-cert\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.762431 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762299 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnxp9\" (UniqueName: \"kubernetes.io/projected/fbb828be-4828-44c8-b5ec-6ffba7895c36-kube-api-access-cnxp9\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: \"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574" Apr 24 16:39:03.762431 
ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762321 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-host-run-k8s-cni-cncf-io\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9" Apr 24 16:39:03.762431 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762365 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.762431 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762382 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fbb828be-4828-44c8-b5ec-6ffba7895c36-device-dir\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: \"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574" Apr 24 16:39:03.762431 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762397 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fbb828be-4828-44c8-b5ec-6ffba7895c36-sys-fs\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: \"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574" Apr 24 16:39:03.762703 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762441 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/36eba3f6-5e75-4e04-8052-6248d70f2dd3-cni-binary-copy\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56" Apr 24 16:39:03.762703 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762514 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/36eba3f6-5e75-4e04-8052-6248d70f2dd3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56" Apr 24 16:39:03.762703 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762543 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs\") pod \"network-metrics-daemon-bzld5\" (UID: \"b98926f9-5237-4269-877b-422b4e1c6edf\") " pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:03.762703 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762583 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e21e6557-bb10-4c31-b3ef-7bcfc18c9d27-konnectivity-ca\") pod \"konnectivity-agent-xxmc2\" (UID: \"e21e6557-bb10-4c31-b3ef-7bcfc18c9d27\") " pod="kube-system/konnectivity-agent-xxmc2" Apr 24 16:39:03.762703 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762633 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-node-log\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.762703 ip-10-0-137-69 
kubenswrapper[2563]: I0424 16:39:03.762657 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fbb828be-4828-44c8-b5ec-6ffba7895c36-socket-dir\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: \"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574" Apr 24 16:39:03.762703 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762681 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/36eba3f6-5e75-4e04-8052-6248d70f2dd3-system-cni-dir\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56" Apr 24 16:39:03.763029 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762702 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts95q\" (UniqueName: \"kubernetes.io/projected/b98926f9-5237-4269-877b-422b4e1c6edf-kube-api-access-ts95q\") pod \"network-metrics-daemon-bzld5\" (UID: \"b98926f9-5237-4269-877b-422b4e1c6edf\") " pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:03.763029 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762750 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-etc-openvswitch\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.763029 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762790 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-var-lib-openvswitch\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.763029 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762833 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-cni-bin\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.763029 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762847 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-multus-conf-dir\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9" Apr 24 16:39:03.763029 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762864 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/36eba3f6-5e75-4e04-8052-6248d70f2dd3-os-release\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56" Apr 24 16:39:03.763029 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762906 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq8ck\" (UniqueName: \"kubernetes.io/projected/36eba3f6-5e75-4e04-8052-6248d70f2dd3-kube-api-access-cq8ck\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56" Apr 24 16:39:03.763029 ip-10-0-137-69 kubenswrapper[2563]: I0424 
16:39:03.762933 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-log-socket\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.763029 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.762982 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-multus-cni-dir\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.763029 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763005 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b514d98b-1969-443b-b7cf-8c931162148a-tmp-dir\") pod \"node-resolver-5dlzs\" (UID: \"b514d98b-1969-443b-b7cf-8c931162148a\") " pod="openshift-dns/node-resolver-5dlzs"
Apr 24 16:39:03.763503 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763042 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/36eba3f6-5e75-4e04-8052-6248d70f2dd3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56"
Apr 24 16:39:03.763503 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763076 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-hostroot\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.763503 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763105 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-kubelet\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.763503 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763169 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55acf3aa-ca3c-4efa-893f-473501b43621-env-overrides\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.763503 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763225 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fbb828be-4828-44c8-b5ec-6ffba7895c36-etc-selinux\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: \"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574"
Apr 24 16:39:03.763503 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763257 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-system-cni-dir\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.763503 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763315 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-host-run-netns\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.763503 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763338 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-host-var-lib-cni-bin\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.763503 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763380 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-host-var-lib-kubelet\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.763503 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763405 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5drq\" (UniqueName: \"kubernetes.io/projected/71bf5297-d6b6-4b3b-a109-b99777f79b22-kube-api-access-q5drq\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.763503 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763448 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5e8ebda8-9661-4160-961c-49db3596480b-host-slash\") pod \"iptables-alerter-bfs48\" (UID: \"5e8ebda8-9661-4160-961c-49db3596480b\") " pod="openshift-network-operator/iptables-alerter-bfs48"
Apr 24 16:39:03.763503 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763471 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-run-systemd\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.764030 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763516 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-cni-netd\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.764030 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763548 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-host-var-lib-cni-multus\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.764030 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763573 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwrnc\" (UniqueName: \"kubernetes.io/projected/55acf3aa-ca3c-4efa-893f-473501b43621-kube-api-access-kwrnc\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.764030 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763606 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5e8ebda8-9661-4160-961c-49db3596480b-iptables-alerter-script\") pod \"iptables-alerter-bfs48\" (UID: \"5e8ebda8-9661-4160-961c-49db3596480b\") " pod="openshift-network-operator/iptables-alerter-bfs48"
Apr 24 16:39:03.764030 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763634 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-slash\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.764030 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763660 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fbb828be-4828-44c8-b5ec-6ffba7895c36-registration-dir\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: \"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574"
Apr 24 16:39:03.764030 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763700 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-run-openvswitch\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.764030 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763739 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5bv7\" (UniqueName: \"kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7\") pod \"network-check-target-spk8g\" (UID: \"1fb076e8-3881-4444-98b6-2d67d3820579\") " pod="openshift-network-diagnostics/network-check-target-spk8g"
Apr 24 16:39:03.764030 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763767 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/71bf5297-d6b6-4b3b-a109-b99777f79b22-cni-binary-copy\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.764030 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763791 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-multus-socket-dir-parent\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.764030 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763817 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e21e6557-bb10-4c31-b3ef-7bcfc18c9d27-agent-certs\") pod \"konnectivity-agent-xxmc2\" (UID: \"e21e6557-bb10-4c31-b3ef-7bcfc18c9d27\") " pod="kube-system/konnectivity-agent-xxmc2"
Apr 24 16:39:03.764030 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763841 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-os-release\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.764030 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.763864 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-264pm\" (UniqueName: \"kubernetes.io/projected/b514d98b-1969-443b-b7cf-8c931162148a-kube-api-access-264pm\") pod \"node-resolver-5dlzs\" (UID: \"b514d98b-1969-443b-b7cf-8c931162148a\") " pod="openshift-dns/node-resolver-5dlzs"
Apr 24 16:39:03.764030 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.764020 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/36eba3f6-5e75-4e04-8052-6248d70f2dd3-cnibin\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56"
Apr 24 16:39:03.764604 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.764057 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-run-ovn\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.764604 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.764101 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-cnibin\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.764604 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.764185 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/55acf3aa-ca3c-4efa-893f-473501b43621-ovnkube-script-lib\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.764604 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.764207 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-systemd-units\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.764604 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.764230 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-run-ovn-kubernetes\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.764604 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.764256 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-etc-kubernetes\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.852235 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.852210 2563 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 16:39:03.864668 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.864646 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-sysctl-d\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.864811 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.864678 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b514d98b-1969-443b-b7cf-8c931162148a-tmp-dir\") pod \"node-resolver-5dlzs\" (UID: \"b514d98b-1969-443b-b7cf-8c931162148a\") " pod="openshift-dns/node-resolver-5dlzs"
Apr 24 16:39:03.864811 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.864696 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/36eba3f6-5e75-4e04-8052-6248d70f2dd3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56"
Apr 24 16:39:03.864811 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.864712 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-hostroot\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.864811 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.864757 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-hostroot\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.864811 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.864791 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-kubelet\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.864811 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.864808 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55acf3aa-ca3c-4efa-893f-473501b43621-env-overrides\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.865055 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.864824 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fbb828be-4828-44c8-b5ec-6ffba7895c36-etc-selinux\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: \"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574"
Apr 24 16:39:03.865055 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.864844 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-system-cni-dir\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.865055 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.864840 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-kubelet\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.865213 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865083 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-host-run-netns\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.865261 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865231 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-host-var-lib-cni-bin\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.865314 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865271 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-host-var-lib-kubelet\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.865364 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865320 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5drq\" (UniqueName: \"kubernetes.io/projected/71bf5297-d6b6-4b3b-a109-b99777f79b22-kube-api-access-q5drq\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.865364 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865354 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5e8ebda8-9661-4160-961c-49db3596480b-host-slash\") pod \"iptables-alerter-bfs48\" (UID: \"5e8ebda8-9661-4160-961c-49db3596480b\") " pod="openshift-network-operator/iptables-alerter-bfs48"
Apr 24 16:39:03.865461 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865384 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-run-systemd\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.865461 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865411 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-cni-netd\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.865461 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865416 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fbb828be-4828-44c8-b5ec-6ffba7895c36-etc-selinux\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: \"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574"
Apr 24 16:39:03.865461 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865446 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-run\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.865648 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865488 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-host-var-lib-kubelet\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.865648 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865484 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-lib-modules\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.865648 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865540 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-host-var-lib-cni-multus\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.865648 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865539 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/36eba3f6-5e75-4e04-8052-6248d70f2dd3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56"
Apr 24 16:39:03.865648 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865555 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-system-cni-dir\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.865648 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865560 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-host-var-lib-cni-bin\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.865648 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865572 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwrnc\" (UniqueName: \"kubernetes.io/projected/55acf3aa-ca3c-4efa-893f-473501b43621-kube-api-access-kwrnc\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.865648 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865429 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b514d98b-1969-443b-b7cf-8c931162148a-tmp-dir\") pod \"node-resolver-5dlzs\" (UID: \"b514d98b-1969-443b-b7cf-8c931162148a\") " pod="openshift-dns/node-resolver-5dlzs"
Apr 24 16:39:03.865648 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865644 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-run-systemd\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.866037 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865654 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-host-var-lib-cni-multus\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.866037 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865701 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5e8ebda8-9661-4160-961c-49db3596480b-host-slash\") pod \"iptables-alerter-bfs48\" (UID: \"5e8ebda8-9661-4160-961c-49db3596480b\") " pod="openshift-network-operator/iptables-alerter-bfs48"
Apr 24 16:39:03.866037 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865712 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-cni-netd\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.866037 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865713 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-host-run-netns\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.866037 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865853 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5e8ebda8-9661-4160-961c-49db3596480b-iptables-alerter-script\") pod \"iptables-alerter-bfs48\" (UID: \"5e8ebda8-9661-4160-961c-49db3596480b\") " pod="openshift-network-operator/iptables-alerter-bfs48"
Apr 24 16:39:03.866037 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.865899 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-slash\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.869612 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867020 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55acf3aa-ca3c-4efa-893f-473501b43621-env-overrides\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.869612 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867095 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fbb828be-4828-44c8-b5ec-6ffba7895c36-registration-dir\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: \"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574"
Apr 24 16:39:03.869612 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867123 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-run-openvswitch\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.869612 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867174 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5bv7\" (UniqueName: \"kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7\") pod \"network-check-target-spk8g\" (UID: \"1fb076e8-3881-4444-98b6-2d67d3820579\") " pod="openshift-network-diagnostics/network-check-target-spk8g"
Apr 24 16:39:03.869612 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867203 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/71bf5297-d6b6-4b3b-a109-b99777f79b22-cni-binary-copy\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.869612 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867230 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-multus-socket-dir-parent\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.869612 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867264 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-sysconfig\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.869612 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867321 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e21e6557-bb10-4c31-b3ef-7bcfc18c9d27-agent-certs\") pod \"konnectivity-agent-xxmc2\" (UID: \"e21e6557-bb10-4c31-b3ef-7bcfc18c9d27\") " pod="kube-system/konnectivity-agent-xxmc2"
Apr 24 16:39:03.869612 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867351 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-os-release\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.869612 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867380 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-264pm\" (UniqueName: \"kubernetes.io/projected/b514d98b-1969-443b-b7cf-8c931162148a-kube-api-access-264pm\") pod \"node-resolver-5dlzs\" (UID: \"b514d98b-1969-443b-b7cf-8c931162148a\") " pod="openshift-dns/node-resolver-5dlzs"
Apr 24 16:39:03.869612 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867408 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/36eba3f6-5e75-4e04-8052-6248d70f2dd3-cnibin\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56"
Apr 24 16:39:03.869612 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867437 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-kubernetes\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.869612 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867460 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vklb2\" (UniqueName: \"kubernetes.io/projected/c532fe1e-4595-4088-b40a-8ee0058e4ccd-kube-api-access-vklb2\") pod \"node-ca-j59hh\" (UID: \"c532fe1e-4595-4088-b40a-8ee0058e4ccd\") " pod="openshift-image-registry/node-ca-j59hh"
Apr 24 16:39:03.869612 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867491 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-run-ovn\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.869612 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867536 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-cnibin\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.869612 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867564 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/55acf3aa-ca3c-4efa-893f-473501b43621-ovnkube-script-lib\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.869612 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867594 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-systemd-units\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.870641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867624 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-run-ovn-kubernetes\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.870641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867649 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-etc-kubernetes\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.870641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867675 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fbb828be-4828-44c8-b5ec-6ffba7895c36-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: \"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574"
Apr 24 16:39:03.870641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867702 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/71bf5297-d6b6-4b3b-a109-b99777f79b22-multus-daemon-config\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.870641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867774 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/36eba3f6-5e75-4e04-8052-6248d70f2dd3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56"
Apr 24 16:39:03.870641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867811 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-sysctl-conf\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.870641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867795 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-run-ovn\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.870641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867846 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c532fe1e-4595-4088-b40a-8ee0058e4ccd-host\") pod \"node-ca-j59hh\" (UID: \"c532fe1e-4595-4088-b40a-8ee0058e4ccd\") " pod="openshift-image-registry/node-ca-j59hh"
Apr 24 16:39:03.870641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867880 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-run-netns\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.870641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867983 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55acf3aa-ca3c-4efa-893f-473501b43621-ovnkube-config\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.870641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.867999 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-run-ovn-kubernetes\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.870641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868000 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-etc-kubernetes\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.870641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868040 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-tuned\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.870641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868079 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-host-run-multus-certs\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.870641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868086 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-cnibin\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.870641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868113 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b514d98b-1969-443b-b7cf-8c931162148a-hosts-file\") pod \"node-resolver-5dlzs\" (UID: \"b514d98b-1969-443b-b7cf-8c931162148a\") " pod="openshift-dns/node-resolver-5dlzs"
Apr 24 16:39:03.870641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868126 2563
swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 16:39:03.870641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868232 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-host-run-multus-certs\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9" Apr 24 16:39:03.871523 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868293 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b514d98b-1969-443b-b7cf-8c931162148a-hosts-file\") pod \"node-resolver-5dlzs\" (UID: \"b514d98b-1969-443b-b7cf-8c931162148a\") " pod="openshift-dns/node-resolver-5dlzs" Apr 24 16:39:03.871523 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868344 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fbb828be-4828-44c8-b5ec-6ffba7895c36-registration-dir\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: \"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574" Apr 24 16:39:03.871523 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868375 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/36eba3f6-5e75-4e04-8052-6248d70f2dd3-cnibin\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56" Apr 24 16:39:03.871523 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868402 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-run-openvswitch\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.871523 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868162 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdvvc\" (UniqueName: \"kubernetes.io/projected/5e8ebda8-9661-4160-961c-49db3596480b-kube-api-access-xdvvc\") pod \"iptables-alerter-bfs48\" (UID: \"5e8ebda8-9661-4160-961c-49db3596480b\") " pod="openshift-network-operator/iptables-alerter-bfs48" Apr 24 16:39:03.871523 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868586 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/71bf5297-d6b6-4b3b-a109-b99777f79b22-multus-daemon-config\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9" Apr 24 16:39:03.871523 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868587 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-multus-socket-dir-parent\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9" Apr 24 16:39:03.871523 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868624 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55acf3aa-ca3c-4efa-893f-473501b43621-ovn-node-metrics-cert\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.871523 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868657 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cnxp9\" (UniqueName: \"kubernetes.io/projected/fbb828be-4828-44c8-b5ec-6ffba7895c36-kube-api-access-cnxp9\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: \"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574" Apr 24 16:39:03.871523 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868721 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-systemd\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl" Apr 24 16:39:03.871523 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868761 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjmrh\" (UniqueName: \"kubernetes.io/projected/ef5cd905-48f8-417e-9a87-328e3af64ce7-kube-api-access-mjmrh\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl" Apr 24 16:39:03.871523 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868794 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-host-run-k8s-cni-cncf-io\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9" Apr 24 16:39:03.871523 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868821 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 
16:39:03.871523 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868849 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fbb828be-4828-44c8-b5ec-6ffba7895c36-device-dir\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: \"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574" Apr 24 16:39:03.871523 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868877 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fbb828be-4828-44c8-b5ec-6ffba7895c36-sys-fs\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: \"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574" Apr 24 16:39:03.871523 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868936 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fbb828be-4828-44c8-b5ec-6ffba7895c36-sys-fs\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: \"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574" Apr 24 16:39:03.871523 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.869009 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-host-run-k8s-cni-cncf-io\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9" Apr 24 16:39:03.872381 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.869103 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fbb828be-4828-44c8-b5ec-6ffba7895c36-device-dir\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: \"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574" Apr 24 16:39:03.872381 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.868934 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fbb828be-4828-44c8-b5ec-6ffba7895c36-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: \"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574" Apr 24 16:39:03.872381 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.869182 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/36eba3f6-5e75-4e04-8052-6248d70f2dd3-cni-binary-copy\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56" Apr 24 16:39:03.872381 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.869211 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/36eba3f6-5e75-4e04-8052-6248d70f2dd3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56" Apr 24 16:39:03.872381 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.869235 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs\") pod \"network-metrics-daemon-bzld5\" (UID: \"b98926f9-5237-4269-877b-422b4e1c6edf\") " pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:03.872381 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.869265 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/e21e6557-bb10-4c31-b3ef-7bcfc18c9d27-konnectivity-ca\") pod \"konnectivity-agent-xxmc2\" (UID: \"e21e6557-bb10-4c31-b3ef-7bcfc18c9d27\") " pod="kube-system/konnectivity-agent-xxmc2" Apr 24 16:39:03.872381 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.869282 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.872381 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.869291 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5e8ebda8-9661-4160-961c-49db3596480b-iptables-alerter-script\") pod \"iptables-alerter-bfs48\" (UID: \"5e8ebda8-9661-4160-961c-49db3596480b\") " pod="openshift-network-operator/iptables-alerter-bfs48" Apr 24 16:39:03.872381 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.869293 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-node-log\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.872381 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.869732 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/36eba3f6-5e75-4e04-8052-6248d70f2dd3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56" Apr 24 16:39:03.872381 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.869977 2563 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55acf3aa-ca3c-4efa-893f-473501b43621-ovnkube-config\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.872381 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.869979 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/36eba3f6-5e75-4e04-8052-6248d70f2dd3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56" Apr 24 16:39:03.872381 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870022 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/71bf5297-d6b6-4b3b-a109-b99777f79b22-cni-binary-copy\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9" Apr 24 16:39:03.872381 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870032 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-slash\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.872381 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870087 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fbb828be-4828-44c8-b5ec-6ffba7895c36-socket-dir\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: \"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574" Apr 24 16:39:03.872381 ip-10-0-137-69 
kubenswrapper[2563]: I0424 16:39:03.870124 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/36eba3f6-5e75-4e04-8052-6248d70f2dd3-system-cni-dir\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56" Apr 24 16:39:03.872381 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870181 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ts95q\" (UniqueName: \"kubernetes.io/projected/b98926f9-5237-4269-877b-422b4e1c6edf-kube-api-access-ts95q\") pod \"network-metrics-daemon-bzld5\" (UID: \"b98926f9-5237-4269-877b-422b4e1c6edf\") " pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:03.873291 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870209 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/36eba3f6-5e75-4e04-8052-6248d70f2dd3-cni-binary-copy\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56" Apr 24 16:39:03.873291 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870218 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-modprobe-d\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl" Apr 24 16:39:03.873291 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870326 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fbb828be-4828-44c8-b5ec-6ffba7895c36-socket-dir\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: 
\"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574" Apr 24 16:39:03.873291 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870367 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-sys\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl" Apr 24 16:39:03.873291 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870401 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-host\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl" Apr 24 16:39:03.873291 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870403 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-node-log\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.873291 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870449 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-run-netns\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.873291 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870484 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-etc-openvswitch\") pod \"ovnkube-node-dmn7d\" (UID: 
\"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.873291 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870485 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/36eba3f6-5e75-4e04-8052-6248d70f2dd3-system-cni-dir\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56" Apr 24 16:39:03.873291 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870486 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/55acf3aa-ca3c-4efa-893f-473501b43621-ovnkube-script-lib\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.873291 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870566 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-etc-openvswitch\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.873291 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870625 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-var-lib-kubelet\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl" Apr 24 16:39:03.873291 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870630 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-systemd-units\") 
pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.873291 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870668 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c532fe1e-4595-4088-b40a-8ee0058e4ccd-serviceca\") pod \"node-ca-j59hh\" (UID: \"c532fe1e-4595-4088-b40a-8ee0058e4ccd\") " pod="openshift-image-registry/node-ca-j59hh" Apr 24 16:39:03.873291 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870707 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-var-lib-openvswitch\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.873291 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870735 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-cni-bin\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.873291 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870792 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-multus-conf-dir\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9" Apr 24 16:39:03.874123 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870840 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/36eba3f6-5e75-4e04-8052-6248d70f2dd3-os-release\") pod 
\"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56" Apr 24 16:39:03.874123 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870872 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cq8ck\" (UniqueName: \"kubernetes.io/projected/36eba3f6-5e75-4e04-8052-6248d70f2dd3-kube-api-access-cq8ck\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56" Apr 24 16:39:03.874123 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870906 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef5cd905-48f8-417e-9a87-328e3af64ce7-tmp\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl" Apr 24 16:39:03.874123 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870935 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-log-socket\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.874123 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870945 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e21e6557-bb10-4c31-b3ef-7bcfc18c9d27-konnectivity-ca\") pod \"konnectivity-agent-xxmc2\" (UID: \"e21e6557-bb10-4c31-b3ef-7bcfc18c9d27\") " pod="kube-system/konnectivity-agent-xxmc2" Apr 24 16:39:03.874123 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:03.870990 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Apr 24 16:39:03.874123 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.870998 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-multus-cni-dir\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9" Apr 24 16:39:03.874123 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.871025 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/36eba3f6-5e75-4e04-8052-6248d70f2dd3-os-release\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56" Apr 24 16:39:03.874123 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:03.871083 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs podName:b98926f9-5237-4269-877b-422b4e1c6edf nodeName:}" failed. No retries permitted until 2026-04-24 16:39:04.371041326 +0000 UTC m=+3.041672206 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs") pod "network-metrics-daemon-bzld5" (UID: "b98926f9-5237-4269-877b-422b4e1c6edf") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:03.874123 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.871117 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-multus-cni-dir\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9" Apr 24 16:39:03.874123 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.871194 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-log-socket\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.874123 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.871245 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-host-cni-bin\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.874123 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.871259 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55acf3aa-ca3c-4efa-893f-473501b43621-var-lib-openvswitch\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:03.874123 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.871315 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-multus-conf-dir\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.874123 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.871361 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/71bf5297-d6b6-4b3b-a109-b99777f79b22-os-release\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.874123 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.872092 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55acf3aa-ca3c-4efa-893f-473501b43621-ovn-node-metrics-cert\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.874123 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.872252 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e21e6557-bb10-4c31-b3ef-7bcfc18c9d27-agent-certs\") pod \"konnectivity-agent-xxmc2\" (UID: \"e21e6557-bb10-4c31-b3ef-7bcfc18c9d27\") " pod="kube-system/konnectivity-agent-xxmc2"
Apr 24 16:39:03.875017 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:03.874505 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:39:03.875017 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:03.874524 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:39:03.875017 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:03.874537 2563 projected.go:194] Error preparing data for projected volume kube-api-access-k5bv7 for pod openshift-network-diagnostics/network-check-target-spk8g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:03.875017 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:03.874597 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7 podName:1fb076e8-3881-4444-98b6-2d67d3820579 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:04.374580859 +0000 UTC m=+3.045211761 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-k5bv7" (UniqueName: "kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7") pod "network-check-target-spk8g" (UID: "1fb076e8-3881-4444-98b6-2d67d3820579") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:03.875745 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.875526 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwrnc\" (UniqueName: \"kubernetes.io/projected/55acf3aa-ca3c-4efa-893f-473501b43621-kube-api-access-kwrnc\") pod \"ovnkube-node-dmn7d\" (UID: \"55acf3aa-ca3c-4efa-893f-473501b43621\") " pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:03.875745 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.875702 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5drq\" (UniqueName: \"kubernetes.io/projected/71bf5297-d6b6-4b3b-a109-b99777f79b22-kube-api-access-q5drq\") pod \"multus-vzbf9\" (UID: \"71bf5297-d6b6-4b3b-a109-b99777f79b22\") " pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:03.876181 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.876159 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-264pm\" (UniqueName: \"kubernetes.io/projected/b514d98b-1969-443b-b7cf-8c931162148a-kube-api-access-264pm\") pod \"node-resolver-5dlzs\" (UID: \"b514d98b-1969-443b-b7cf-8c931162148a\") " pod="openshift-dns/node-resolver-5dlzs"
Apr 24 16:39:03.877149 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.877110 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdvvc\" (UniqueName: \"kubernetes.io/projected/5e8ebda8-9661-4160-961c-49db3596480b-kube-api-access-xdvvc\") pod \"iptables-alerter-bfs48\" (UID: \"5e8ebda8-9661-4160-961c-49db3596480b\") " pod="openshift-network-operator/iptables-alerter-bfs48"
Apr 24 16:39:03.878059 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.878039 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnxp9\" (UniqueName: \"kubernetes.io/projected/fbb828be-4828-44c8-b5ec-6ffba7895c36-kube-api-access-cnxp9\") pod \"aws-ebs-csi-driver-node-5c574\" (UID: \"fbb828be-4828-44c8-b5ec-6ffba7895c36\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574"
Apr 24 16:39:03.880149 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.880120 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts95q\" (UniqueName: \"kubernetes.io/projected/b98926f9-5237-4269-877b-422b4e1c6edf-kube-api-access-ts95q\") pod \"network-metrics-daemon-bzld5\" (UID: \"b98926f9-5237-4269-877b-422b4e1c6edf\") " pod="openshift-multus/network-metrics-daemon-bzld5"
Apr 24 16:39:03.880335 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.880315 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq8ck\" (UniqueName: \"kubernetes.io/projected/36eba3f6-5e75-4e04-8052-6248d70f2dd3-kube-api-access-cq8ck\") pod \"multus-additional-cni-plugins-hhp56\" (UID: \"36eba3f6-5e75-4e04-8052-6248d70f2dd3\") " pod="openshift-multus/multus-additional-cni-plugins-hhp56"
Apr 24 16:39:03.927901 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.927863 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 16:34:02 +0000 UTC" deadline="2027-10-14 17:02:57.521485944 +0000 UTC"
Apr 24 16:39:03.927901 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.927896 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12912h23m53.593594679s"
Apr 24 16:39:03.971887 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.971857 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-sysconfig\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972008 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.971898 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-kubernetes\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972008 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.971922 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vklb2\" (UniqueName: \"kubernetes.io/projected/c532fe1e-4595-4088-b40a-8ee0058e4ccd-kube-api-access-vklb2\") pod \"node-ca-j59hh\" (UID: \"c532fe1e-4595-4088-b40a-8ee0058e4ccd\") " pod="openshift-image-registry/node-ca-j59hh"
Apr 24 16:39:03.972008 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.971950 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-sysctl-conf\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972008 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.971974 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c532fe1e-4595-4088-b40a-8ee0058e4ccd-host\") pod \"node-ca-j59hh\" (UID: \"c532fe1e-4595-4088-b40a-8ee0058e4ccd\") " pod="openshift-image-registry/node-ca-j59hh"
Apr 24 16:39:03.972008 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.971991 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-kubernetes\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972008 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.971999 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-tuned\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972301 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972026 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-systemd\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972301 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972033 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c532fe1e-4595-4088-b40a-8ee0058e4ccd-host\") pod \"node-ca-j59hh\" (UID: \"c532fe1e-4595-4088-b40a-8ee0058e4ccd\") " pod="openshift-image-registry/node-ca-j59hh"
Apr 24 16:39:03.972301 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972051 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjmrh\" (UniqueName: \"kubernetes.io/projected/ef5cd905-48f8-417e-9a87-328e3af64ce7-kube-api-access-mjmrh\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972301 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972113 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-modprobe-d\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972301 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972121 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-sysctl-conf\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972301 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972153 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-sys\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972301 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972175 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-host\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972301 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972202 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-var-lib-kubelet\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972301 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972205 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-systemd\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972301 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972225 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c532fe1e-4595-4088-b40a-8ee0058e4ccd-serviceca\") pod \"node-ca-j59hh\" (UID: \"c532fe1e-4595-4088-b40a-8ee0058e4ccd\") " pod="openshift-image-registry/node-ca-j59hh"
Apr 24 16:39:03.972301 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.971992 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-sysconfig\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972301 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972252 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef5cd905-48f8-417e-9a87-328e3af64ce7-tmp\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972884 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972311 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-sysctl-d\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972884 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972349 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-run\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972884 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972371 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-lib-modules\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972884 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972513 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-lib-modules\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972884 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972616 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-run\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972884 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972643 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-sysctl-d\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972884 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972644 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-host\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972884 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972676 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-sys\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972884 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972691 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-modprobe-d\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.972884 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.972730 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef5cd905-48f8-417e-9a87-328e3af64ce7-var-lib-kubelet\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.973364 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.973075 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c532fe1e-4595-4088-b40a-8ee0058e4ccd-serviceca\") pod \"node-ca-j59hh\" (UID: \"c532fe1e-4595-4088-b40a-8ee0058e4ccd\") " pod="openshift-image-registry/node-ca-j59hh"
Apr 24 16:39:03.974575 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.974554 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ef5cd905-48f8-417e-9a87-328e3af64ce7-etc-tuned\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.974669 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.974587 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef5cd905-48f8-417e-9a87-328e3af64ce7-tmp\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:03.979937 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.979919 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vklb2\" (UniqueName: \"kubernetes.io/projected/c532fe1e-4595-4088-b40a-8ee0058e4ccd-kube-api-access-vklb2\") pod \"node-ca-j59hh\" (UID: \"c532fe1e-4595-4088-b40a-8ee0058e4ccd\") " pod="openshift-image-registry/node-ca-j59hh"
Apr 24 16:39:03.981602 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:03.981553 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjmrh\" (UniqueName: \"kubernetes.io/projected/ef5cd905-48f8-417e-9a87-328e3af64ce7-kube-api-access-mjmrh\") pod \"tuned-b2vpl\" (UID: \"ef5cd905-48f8-417e-9a87-328e3af64ce7\") " pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:04.033508 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.033484 2563 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 16:39:04.056500 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.056479 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xxmc2"
Apr 24 16:39:04.064245 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.064227 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574"
Apr 24 16:39:04.073876 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.073859 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-bfs48"
Apr 24 16:39:04.080205 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.080186 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5dlzs"
Apr 24 16:39:04.087770 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.087754 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vzbf9"
Apr 24 16:39:04.095274 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.095259 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hhp56"
Apr 24 16:39:04.101793 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.101778 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-b2vpl"
Apr 24 16:39:04.110351 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.110332 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d"
Apr 24 16:39:04.115821 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.115801 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j59hh"
Apr 24 16:39:04.374944 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.374818 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5bv7\" (UniqueName: \"kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7\") pod \"network-check-target-spk8g\" (UID: \"1fb076e8-3881-4444-98b6-2d67d3820579\") " pod="openshift-network-diagnostics/network-check-target-spk8g"
Apr 24 16:39:04.374944 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.374882 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs\") pod \"network-metrics-daemon-bzld5\" (UID: \"b98926f9-5237-4269-877b-422b4e1c6edf\") " pod="openshift-multus/network-metrics-daemon-bzld5"
Apr 24 16:39:04.375106 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:04.374950 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:39:04.375106 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:04.374970 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:39:04.375106 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:04.374981 2563 projected.go:194] Error preparing data for projected volume kube-api-access-k5bv7 for pod openshift-network-diagnostics/network-check-target-spk8g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:04.375106 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:04.375023 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:04.375106 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:04.375035 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7 podName:1fb076e8-3881-4444-98b6-2d67d3820579 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:05.375018878 +0000 UTC m=+4.045649758 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-k5bv7" (UniqueName: "kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7") pod "network-check-target-spk8g" (UID: "1fb076e8-3881-4444-98b6-2d67d3820579") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:04.375106 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:04.375079 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs podName:b98926f9-5237-4269-877b-422b4e1c6edf nodeName:}" failed. No retries permitted until 2026-04-24 16:39:05.375062616 +0000 UTC m=+4.045693511 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs") pod "network-metrics-daemon-bzld5" (UID: "b98926f9-5237-4269-877b-422b4e1c6edf") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:04.375497 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:04.375477 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef5cd905_48f8_417e_9a87_328e3af64ce7.slice/crio-d73246db4941ad18f1c04012be596d62072c6c92618861325de04ef957499662 WatchSource:0}: Error finding container d73246db4941ad18f1c04012be596d62072c6c92618861325de04ef957499662: Status 404 returned error can't find the container with id d73246db4941ad18f1c04012be596d62072c6c92618861325de04ef957499662
Apr 24 16:39:04.380332 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:04.380309 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbb828be_4828_44c8_b5ec_6ffba7895c36.slice/crio-7053b59e69d0da97156053771af0cb6047ee70d64ecee4549099502274464d12 WatchSource:0}: Error finding container 7053b59e69d0da97156053771af0cb6047ee70d64ecee4549099502274464d12: Status 404 returned error can't find the container with id 7053b59e69d0da97156053771af0cb6047ee70d64ecee4549099502274464d12
Apr 24 16:39:04.381125 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:04.381097 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55acf3aa_ca3c_4efa_893f_473501b43621.slice/crio-e3091e3f62c7eb4b7f899192c8f8a9c699cc8f5c103d835fbd131cd7c756c5f6 WatchSource:0}: Error finding container e3091e3f62c7eb4b7f899192c8f8a9c699cc8f5c103d835fbd131cd7c756c5f6: Status 404 returned error can't find the container with id e3091e3f62c7eb4b7f899192c8f8a9c699cc8f5c103d835fbd131cd7c756c5f6
Apr 24 16:39:04.381970 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:04.381945 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36eba3f6_5e75_4e04_8052_6248d70f2dd3.slice/crio-f9fae2a8c89df73e340ccf80424b47cb68b709581ec5e6807354ab6863d4ee95 WatchSource:0}: Error finding container f9fae2a8c89df73e340ccf80424b47cb68b709581ec5e6807354ab6863d4ee95: Status 404 returned error can't find the container with id f9fae2a8c89df73e340ccf80424b47cb68b709581ec5e6807354ab6863d4ee95
Apr 24 16:39:04.383453 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:04.383390 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e8ebda8_9661_4160_961c_49db3596480b.slice/crio-c036da170ba0df1200493aeef4dc2372a8ae02c5ef99b40ec4ea5144ccfcaa8e WatchSource:0}: Error finding container c036da170ba0df1200493aeef4dc2372a8ae02c5ef99b40ec4ea5144ccfcaa8e: Status 404 returned error can't find the container with id c036da170ba0df1200493aeef4dc2372a8ae02c5ef99b40ec4ea5144ccfcaa8e
Apr 24 16:39:04.384870 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:04.384846 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb514d98b_1969_443b_b7cf_8c931162148a.slice/crio-74c0cacfac4390d76a9f9d2ce936590022e1e15c0e6b93aa220e376bf76a8acd WatchSource:0}: Error finding container 74c0cacfac4390d76a9f9d2ce936590022e1e15c0e6b93aa220e376bf76a8acd: Status 404 returned error can't find the container with id 74c0cacfac4390d76a9f9d2ce936590022e1e15c0e6b93aa220e376bf76a8acd
Apr 24 16:39:04.385922 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:04.385878 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode21e6557_bb10_4c31_b3ef_7bcfc18c9d27.slice/crio-f3a26887539a393aa9938ff913294b2820bf2c02d664a79745a0882827e3e6ea WatchSource:0}: Error finding container f3a26887539a393aa9938ff913294b2820bf2c02d664a79745a0882827e3e6ea: Status 404 returned error can't find the container with id f3a26887539a393aa9938ff913294b2820bf2c02d664a79745a0882827e3e6ea
Apr 24 16:39:04.386166 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:04.386094 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bf5297_d6b6_4b3b_a109_b99777f79b22.slice/crio-bae672b15c083854d0e431120792357abf4a54b5f422d242cff654b83de12fed WatchSource:0}: Error finding container bae672b15c083854d0e431120792357abf4a54b5f422d242cff654b83de12fed: Status 404 returned error can't find the container with id bae672b15c083854d0e431120792357abf4a54b5f422d242cff654b83de12fed
Apr 24 16:39:04.387040 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:04.387023 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc532fe1e_4595_4088_b40a_8ee0058e4ccd.slice/crio-5d1adec471bf107437276f7ccf89a38f786983668fdfbcaf946569ccafd25cb9 WatchSource:0}: Error finding container 5d1adec471bf107437276f7ccf89a38f786983668fdfbcaf946569ccafd25cb9: Status 404 returned error can't find the container with id 5d1adec471bf107437276f7ccf89a38f786983668fdfbcaf946569ccafd25cb9
Apr 24 16:39:04.873968 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.873662 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j59hh" event={"ID":"c532fe1e-4595-4088-b40a-8ee0058e4ccd","Type":"ContainerStarted","Data":"5d1adec471bf107437276f7ccf89a38f786983668fdfbcaf946569ccafd25cb9"}
Apr 24 16:39:04.879768 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.879699 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vzbf9" event={"ID":"71bf5297-d6b6-4b3b-a109-b99777f79b22","Type":"ContainerStarted","Data":"bae672b15c083854d0e431120792357abf4a54b5f422d242cff654b83de12fed"}
Apr 24 16:39:04.885092 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.885026 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xxmc2" event={"ID":"e21e6557-bb10-4c31-b3ef-7bcfc18c9d27","Type":"ContainerStarted","Data":"f3a26887539a393aa9938ff913294b2820bf2c02d664a79745a0882827e3e6ea"}
Apr 24 16:39:04.886406 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.886353 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhp56" event={"ID":"36eba3f6-5e75-4e04-8052-6248d70f2dd3","Type":"ContainerStarted","Data":"f9fae2a8c89df73e340ccf80424b47cb68b709581ec5e6807354ab6863d4ee95"}
Apr 24 16:39:04.890816 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.890583 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" event={"ID":"55acf3aa-ca3c-4efa-893f-473501b43621","Type":"ContainerStarted","Data":"e3091e3f62c7eb4b7f899192c8f8a9c699cc8f5c103d835fbd131cd7c756c5f6"}
Apr 24 16:39:04.896326 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.896260 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-b2vpl" event={"ID":"ef5cd905-48f8-417e-9a87-328e3af64ce7","Type":"ContainerStarted","Data":"d73246db4941ad18f1c04012be596d62072c6c92618861325de04ef957499662"}
Apr 24 16:39:04.901688 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.901660 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-69.ec2.internal" event={"ID":"f61c386af5d7ea584ac3696d9ab65309","Type":"ContainerStarted","Data":"a5a714bff8fc31a5f4080d5fab8ecd935e0bee77ffae16e5df4a751ff0af58fc"}
Apr 24 16:39:04.904002 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.903977 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5dlzs" event={"ID":"b514d98b-1969-443b-b7cf-8c931162148a","Type":"ContainerStarted","Data":"74c0cacfac4390d76a9f9d2ce936590022e1e15c0e6b93aa220e376bf76a8acd"}
Apr 24 16:39:04.906255 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.906124 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-bfs48" event={"ID":"5e8ebda8-9661-4160-961c-49db3596480b","Type":"ContainerStarted","Data":"c036da170ba0df1200493aeef4dc2372a8ae02c5ef99b40ec4ea5144ccfcaa8e"}
Apr 24 16:39:04.913208 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.913175 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574" event={"ID":"fbb828be-4828-44c8-b5ec-6ffba7895c36","Type":"ContainerStarted","Data":"7053b59e69d0da97156053771af0cb6047ee70d64ecee4549099502274464d12"}
Apr 24 16:39:04.928598 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.928565 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 16:34:02 +0000 UTC" deadline="2027-09-26 19:13:06.847874384 +0000 UTC"
Apr 24 16:39:04.928598 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:04.928601 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12482h34m1.919278533s"
Apr 24 16:39:05.387344 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:05.386377 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5bv7\" (UniqueName: \"kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7\") pod \"network-check-target-spk8g\" (UID: \"1fb076e8-3881-4444-98b6-2d67d3820579\") " pod="openshift-network-diagnostics/network-check-target-spk8g"
Apr 24 16:39:05.387344 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:05.386444 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs\") pod \"network-metrics-daemon-bzld5\" (UID: \"b98926f9-5237-4269-877b-422b4e1c6edf\") " pod="openshift-multus/network-metrics-daemon-bzld5"
Apr 24 16:39:05.387344 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:05.386564 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:05.387344 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:05.386621 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs podName:b98926f9-5237-4269-877b-422b4e1c6edf nodeName:}" failed. No retries permitted until 2026-04-24 16:39:07.386601387 +0000 UTC m=+6.057232272 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs") pod "network-metrics-daemon-bzld5" (UID: "b98926f9-5237-4269-877b-422b4e1c6edf") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:05.387344 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:05.386993 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:39:05.387344 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:05.387011 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:39:05.387344 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:05.387024 2563 projected.go:194] Error preparing data for projected volume kube-api-access-k5bv7 for pod openshift-network-diagnostics/network-check-target-spk8g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:05.387344 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:05.387065 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7 podName:1fb076e8-3881-4444-98b6-2d67d3820579 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:07.387050667 +0000 UTC m=+6.057681548 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-k5bv7" (UniqueName: "kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7") pod "network-check-target-spk8g" (UID: "1fb076e8-3881-4444-98b6-2d67d3820579") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:05.856381 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:05.856274 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzld5"
Apr 24 16:39:05.856556 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:05.856430 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzld5" podUID="b98926f9-5237-4269-877b-422b4e1c6edf"
Apr 24 16:39:05.856797 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:05.856651 2563 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:39:05.856797 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:05.856750 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spk8g" podUID="1fb076e8-3881-4444-98b6-2d67d3820579" Apr 24 16:39:05.936239 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:05.936198 2563 generic.go:358] "Generic (PLEG): container finished" podID="c6abe1252542ce53f314b4a89b391fbf" containerID="f6bfd823ef2614ff5e95f6f2da0da61377669cfa02b8f74b70b0e55ea05d59e6" exitCode=0 Apr 24 16:39:05.937078 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:05.937032 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal" event={"ID":"c6abe1252542ce53f314b4a89b391fbf","Type":"ContainerDied","Data":"f6bfd823ef2614ff5e95f6f2da0da61377669cfa02b8f74b70b0e55ea05d59e6"} Apr 24 16:39:05.966986 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:05.966939 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-69.ec2.internal" podStartSLOduration=3.9669241939999997 podStartE2EDuration="3.966924194s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:39:04.918359537 +0000 UTC m=+3.588990441" watchObservedRunningTime="2026-04-24 16:39:05.966924194 +0000 UTC m=+4.637555097" Apr 24 16:39:06.955291 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:06.955183 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal" event={"ID":"c6abe1252542ce53f314b4a89b391fbf","Type":"ContainerStarted","Data":"e5d7f4f8d9bf31e6d7550aef2df3b2f9950167acf8ea53e9987735e6d3bf66be"} Apr 24 16:39:07.403312 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:07.402368 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5bv7\" (UniqueName: \"kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7\") pod \"network-check-target-spk8g\" (UID: \"1fb076e8-3881-4444-98b6-2d67d3820579\") " pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:39:07.403312 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:07.402454 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs\") pod \"network-metrics-daemon-bzld5\" (UID: \"b98926f9-5237-4269-877b-422b4e1c6edf\") " pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:07.403312 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:07.402585 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:07.403312 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:07.402649 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs podName:b98926f9-5237-4269-877b-422b4e1c6edf nodeName:}" failed. No retries permitted until 2026-04-24 16:39:11.402629784 +0000 UTC m=+10.073260665 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs") pod "network-metrics-daemon-bzld5" (UID: "b98926f9-5237-4269-877b-422b4e1c6edf") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:07.403312 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:07.403078 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:07.403312 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:07.403097 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:07.403312 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:07.403112 2563 projected.go:194] Error preparing data for projected volume kube-api-access-k5bv7 for pod openshift-network-diagnostics/network-check-target-spk8g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:07.403312 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:07.403197 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7 podName:1fb076e8-3881-4444-98b6-2d67d3820579 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:11.40318034 +0000 UTC m=+10.073811221 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-k5bv7" (UniqueName: "kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7") pod "network-check-target-spk8g" (UID: "1fb076e8-3881-4444-98b6-2d67d3820579") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:07.856281 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:07.856222 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:39:07.856281 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:07.856253 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:07.856527 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:07.856349 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spk8g" podUID="1fb076e8-3881-4444-98b6-2d67d3820579" Apr 24 16:39:07.856527 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:07.856489 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzld5" podUID="b98926f9-5237-4269-877b-422b4e1c6edf" Apr 24 16:39:09.854733 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:09.854540 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:39:09.854733 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:09.854679 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spk8g" podUID="1fb076e8-3881-4444-98b6-2d67d3820579" Apr 24 16:39:09.854733 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:09.854732 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:09.855324 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:09.854859 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzld5" podUID="b98926f9-5237-4269-877b-422b4e1c6edf" Apr 24 16:39:11.438588 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:11.438544 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs\") pod \"network-metrics-daemon-bzld5\" (UID: \"b98926f9-5237-4269-877b-422b4e1c6edf\") " pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:11.439073 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:11.438612 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5bv7\" (UniqueName: \"kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7\") pod \"network-check-target-spk8g\" (UID: \"1fb076e8-3881-4444-98b6-2d67d3820579\") " pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:39:11.439073 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:11.438730 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:11.439073 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:11.438743 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:11.439073 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:11.438758 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:11.439073 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:11.438769 2563 projected.go:194] Error preparing data for projected volume kube-api-access-k5bv7 for pod openshift-network-diagnostics/network-check-target-spk8g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:11.439073 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:11.438797 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs podName:b98926f9-5237-4269-877b-422b4e1c6edf nodeName:}" failed. No retries permitted until 2026-04-24 16:39:19.438778246 +0000 UTC m=+18.109409137 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs") pod "network-metrics-daemon-bzld5" (UID: "b98926f9-5237-4269-877b-422b4e1c6edf") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:11.439073 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:11.438815 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7 podName:1fb076e8-3881-4444-98b6-2d67d3820579 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:19.438806196 +0000 UTC m=+18.109437076 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-k5bv7" (UniqueName: "kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7") pod "network-check-target-spk8g" (UID: "1fb076e8-3881-4444-98b6-2d67d3820579") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:11.856320 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:11.856233 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:11.856465 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:11.856352 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzld5" podUID="b98926f9-5237-4269-877b-422b4e1c6edf" Apr 24 16:39:11.856716 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:11.856699 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:39:11.856807 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:11.856787 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spk8g" podUID="1fb076e8-3881-4444-98b6-2d67d3820579" Apr 24 16:39:13.854417 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:13.854382 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:39:13.854417 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:13.854408 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:13.854899 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:13.854524 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spk8g" podUID="1fb076e8-3881-4444-98b6-2d67d3820579" Apr 24 16:39:13.854899 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:13.854660 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzld5" podUID="b98926f9-5237-4269-877b-422b4e1c6edf" Apr 24 16:39:15.855241 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:15.855204 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:15.855661 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:15.855220 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:39:15.855661 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:15.855365 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzld5" podUID="b98926f9-5237-4269-877b-422b4e1c6edf" Apr 24 16:39:15.855661 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:15.855409 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spk8g" podUID="1fb076e8-3881-4444-98b6-2d67d3820579" Apr 24 16:39:17.854605 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:17.854563 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:39:17.855024 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:17.854613 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:17.855024 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:17.854706 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spk8g" podUID="1fb076e8-3881-4444-98b6-2d67d3820579" Apr 24 16:39:17.855024 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:17.854834 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzld5" podUID="b98926f9-5237-4269-877b-422b4e1c6edf" Apr 24 16:39:19.494958 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:19.494919 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5bv7\" (UniqueName: \"kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7\") pod \"network-check-target-spk8g\" (UID: \"1fb076e8-3881-4444-98b6-2d67d3820579\") " pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:39:19.495535 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:19.494972 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs\") pod \"network-metrics-daemon-bzld5\" (UID: \"b98926f9-5237-4269-877b-422b4e1c6edf\") " pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:19.495535 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:19.495077 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:19.495535 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:19.495123 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:19.495535 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:19.495161 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:19.495535 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:19.495170 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs podName:b98926f9-5237-4269-877b-422b4e1c6edf nodeName:}" failed. 
No retries permitted until 2026-04-24 16:39:35.495153624 +0000 UTC m=+34.165784521 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs") pod "network-metrics-daemon-bzld5" (UID: "b98926f9-5237-4269-877b-422b4e1c6edf") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:19.495535 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:19.495171 2563 projected.go:194] Error preparing data for projected volume kube-api-access-k5bv7 for pod openshift-network-diagnostics/network-check-target-spk8g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:19.495535 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:19.495207 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7 podName:1fb076e8-3881-4444-98b6-2d67d3820579 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:35.495199483 +0000 UTC m=+34.165830362 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-k5bv7" (UniqueName: "kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7") pod "network-check-target-spk8g" (UID: "1fb076e8-3881-4444-98b6-2d67d3820579") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:19.854935 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:19.854862 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:39:19.855065 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:19.854862 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:19.855065 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:19.854956 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spk8g" podUID="1fb076e8-3881-4444-98b6-2d67d3820579" Apr 24 16:39:19.855065 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:19.855034 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzld5" podUID="b98926f9-5237-4269-877b-422b4e1c6edf" Apr 24 16:39:20.116412 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:20.116317 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-69.ec2.internal" podStartSLOduration=18.116301509 podStartE2EDuration="18.116301509s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:39:06.972663081 +0000 UTC m=+5.643293985" watchObservedRunningTime="2026-04-24 16:39:20.116301509 +0000 UTC m=+18.786932410" Apr 24 16:39:20.116940 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:20.116924 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6swzl"] Apr 24 16:39:20.138680 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:20.138654 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:20.138823 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:20.138738 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6swzl" podUID="c883a6e5-b74b-4d33-9372-a3da5fd267f6" Apr 24 16:39:20.200652 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:20.200609 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c883a6e5-b74b-4d33-9372-a3da5fd267f6-original-pull-secret\") pod \"global-pull-secret-syncer-6swzl\" (UID: \"c883a6e5-b74b-4d33-9372-a3da5fd267f6\") " pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:20.200652 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:20.200662 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c883a6e5-b74b-4d33-9372-a3da5fd267f6-dbus\") pod \"global-pull-secret-syncer-6swzl\" (UID: \"c883a6e5-b74b-4d33-9372-a3da5fd267f6\") " pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:20.200895 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:20.200737 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c883a6e5-b74b-4d33-9372-a3da5fd267f6-kubelet-config\") pod \"global-pull-secret-syncer-6swzl\" (UID: \"c883a6e5-b74b-4d33-9372-a3da5fd267f6\") " pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:20.301625 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:20.301589 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c883a6e5-b74b-4d33-9372-a3da5fd267f6-original-pull-secret\") pod \"global-pull-secret-syncer-6swzl\" (UID: \"c883a6e5-b74b-4d33-9372-a3da5fd267f6\") " pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:20.301625 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:20.301621 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c883a6e5-b74b-4d33-9372-a3da5fd267f6-dbus\") pod \"global-pull-secret-syncer-6swzl\" (UID: \"c883a6e5-b74b-4d33-9372-a3da5fd267f6\") " pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:20.301861 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:20.301683 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c883a6e5-b74b-4d33-9372-a3da5fd267f6-kubelet-config\") pod \"global-pull-secret-syncer-6swzl\" (UID: \"c883a6e5-b74b-4d33-9372-a3da5fd267f6\") " pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:20.301861 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:20.301745 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:20.301861 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:20.301782 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c883a6e5-b74b-4d33-9372-a3da5fd267f6-kubelet-config\") pod \"global-pull-secret-syncer-6swzl\" (UID: \"c883a6e5-b74b-4d33-9372-a3da5fd267f6\") " pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:20.301861 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:20.301818 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c883a6e5-b74b-4d33-9372-a3da5fd267f6-original-pull-secret podName:c883a6e5-b74b-4d33-9372-a3da5fd267f6 nodeName:}" failed. 
No retries permitted until 2026-04-24 16:39:20.801798961 +0000 UTC m=+19.472429864 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c883a6e5-b74b-4d33-9372-a3da5fd267f6-original-pull-secret") pod "global-pull-secret-syncer-6swzl" (UID: "c883a6e5-b74b-4d33-9372-a3da5fd267f6") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:20.302052 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:20.301868 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c883a6e5-b74b-4d33-9372-a3da5fd267f6-dbus\") pod \"global-pull-secret-syncer-6swzl\" (UID: \"c883a6e5-b74b-4d33-9372-a3da5fd267f6\") " pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:20.805697 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:20.805661 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c883a6e5-b74b-4d33-9372-a3da5fd267f6-original-pull-secret\") pod \"global-pull-secret-syncer-6swzl\" (UID: \"c883a6e5-b74b-4d33-9372-a3da5fd267f6\") " pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:20.806094 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:20.805825 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:20.806094 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:20.805895 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c883a6e5-b74b-4d33-9372-a3da5fd267f6-original-pull-secret podName:c883a6e5-b74b-4d33-9372-a3da5fd267f6 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:21.805873943 +0000 UTC m=+20.476504822 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c883a6e5-b74b-4d33-9372-a3da5fd267f6-original-pull-secret") pod "global-pull-secret-syncer-6swzl" (UID: "c883a6e5-b74b-4d33-9372-a3da5fd267f6") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:21.812283 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:21.812105 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c883a6e5-b74b-4d33-9372-a3da5fd267f6-original-pull-secret\") pod \"global-pull-secret-syncer-6swzl\" (UID: \"c883a6e5-b74b-4d33-9372-a3da5fd267f6\") " pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:21.812780 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:21.812260 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:21.812780 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:21.812349 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c883a6e5-b74b-4d33-9372-a3da5fd267f6-original-pull-secret podName:c883a6e5-b74b-4d33-9372-a3da5fd267f6 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:23.812334265 +0000 UTC m=+22.482965145 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c883a6e5-b74b-4d33-9372-a3da5fd267f6-original-pull-secret") pod "global-pull-secret-syncer-6swzl" (UID: "c883a6e5-b74b-4d33-9372-a3da5fd267f6") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:21.855614 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:21.855551 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:21.855737 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:21.855644 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6swzl" podUID="c883a6e5-b74b-4d33-9372-a3da5fd267f6" Apr 24 16:39:21.855737 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:21.855714 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:21.855851 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:21.855801 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzld5" podUID="b98926f9-5237-4269-877b-422b4e1c6edf" Apr 24 16:39:21.855851 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:21.855835 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:39:21.855948 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:21.855893 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-spk8g" podUID="1fb076e8-3881-4444-98b6-2d67d3820579" Apr 24 16:39:21.981627 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:21.981598 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j59hh" event={"ID":"c532fe1e-4595-4088-b40a-8ee0058e4ccd","Type":"ContainerStarted","Data":"623e40050f4ddf09fa0a13e5c16cfceb7a7830263aaa245b78750aa3a9ec0c32"} Apr 24 16:39:21.982898 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:21.982873 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vzbf9" event={"ID":"71bf5297-d6b6-4b3b-a109-b99777f79b22","Type":"ContainerStarted","Data":"cc2276d3eded029eba1d4916ca8d7a1c8845293ca5e251c07e34b1235a583514"} Apr 24 16:39:21.984107 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:21.984081 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xxmc2" event={"ID":"e21e6557-bb10-4c31-b3ef-7bcfc18c9d27","Type":"ContainerStarted","Data":"25eb1372f937c4d8874eeb7667b2e6ce366d4dd7133140efe5f835d15a01dbc7"} Apr 24 16:39:21.985714 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:21.985690 2563 generic.go:358] "Generic (PLEG): container finished" podID="36eba3f6-5e75-4e04-8052-6248d70f2dd3" containerID="f85f4652c86637cecd0de7e55a8f78b1c12eefaa16403358b67f3f3366999072" exitCode=0 Apr 24 16:39:21.985859 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:21.985781 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhp56" event={"ID":"36eba3f6-5e75-4e04-8052-6248d70f2dd3","Type":"ContainerDied","Data":"f85f4652c86637cecd0de7e55a8f78b1c12eefaa16403358b67f3f3366999072"} Apr 24 16:39:21.988126 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:21.988095 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" 
event={"ID":"55acf3aa-ca3c-4efa-893f-473501b43621","Type":"ContainerStarted","Data":"ee02b819f2ab86cb1ad9f57f5278027a28fe2c128b2c70a3885fa254abc120c0"} Apr 24 16:39:21.988126 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:21.988118 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" event={"ID":"55acf3aa-ca3c-4efa-893f-473501b43621","Type":"ContainerStarted","Data":"84b02514c85c1f2563760b95c7591b20b7c6db6bfe01d069b727e885feb6d65f"} Apr 24 16:39:21.988274 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:21.988130 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" event={"ID":"55acf3aa-ca3c-4efa-893f-473501b43621","Type":"ContainerStarted","Data":"6dc27054e043d27b4b9c6e35a3b5c5a67f461aaa4f5dce88d924fdb8d090fb68"} Apr 24 16:39:21.989494 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:21.989470 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-b2vpl" event={"ID":"ef5cd905-48f8-417e-9a87-328e3af64ce7","Type":"ContainerStarted","Data":"44e4954ad2d48db34a05abc1df95a3d925355f0c810ce4a0fe0ea77396df1a57"} Apr 24 16:39:21.993096 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:21.992993 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5dlzs" event={"ID":"b514d98b-1969-443b-b7cf-8c931162148a","Type":"ContainerStarted","Data":"f1d7b124adde3e68b8e830dd768a31ba603e1a11dc7f4be66fc074eef3e56424"} Apr 24 16:39:21.994324 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:21.994304 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574" event={"ID":"fbb828be-4828-44c8-b5ec-6ffba7895c36","Type":"ContainerStarted","Data":"fb0d823752cd1eb7dff92397ace6489aebec73a78b6b871c54e4a73d58f64146"} Apr 24 16:39:21.995555 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:21.995520 2563 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-image-registry/node-ca-j59hh" podStartSLOduration=7.627064517 podStartE2EDuration="19.995510823s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="2026-04-24 16:39:04.389084529 +0000 UTC m=+3.059715427" lastFinishedPulling="2026-04-24 16:39:16.75753085 +0000 UTC m=+15.428161733" observedRunningTime="2026-04-24 16:39:21.995339516 +0000 UTC m=+20.665970418" watchObservedRunningTime="2026-04-24 16:39:21.995510823 +0000 UTC m=+20.666141723" Apr 24 16:39:22.012184 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:22.012150 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-b2vpl" podStartSLOduration=2.98998282 podStartE2EDuration="20.012122615s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="2026-04-24 16:39:04.377970733 +0000 UTC m=+3.048601613" lastFinishedPulling="2026-04-24 16:39:21.40011053 +0000 UTC m=+20.070741408" observedRunningTime="2026-04-24 16:39:22.012051634 +0000 UTC m=+20.682682535" watchObservedRunningTime="2026-04-24 16:39:22.012122615 +0000 UTC m=+20.682753516" Apr 24 16:39:22.030044 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:22.030003 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xxmc2" podStartSLOduration=4.073473228 podStartE2EDuration="21.029993007s" podCreationTimestamp="2026-04-24 16:39:01 +0000 UTC" firstStartedPulling="2026-04-24 16:39:04.388867257 +0000 UTC m=+3.059498136" lastFinishedPulling="2026-04-24 16:39:21.345387031 +0000 UTC m=+20.016017915" observedRunningTime="2026-04-24 16:39:22.028696417 +0000 UTC m=+20.699327318" watchObservedRunningTime="2026-04-24 16:39:22.029993007 +0000 UTC m=+20.700623907" Apr 24 16:39:22.080602 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:22.080563 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5dlzs" podStartSLOduration=3.093926416 
podStartE2EDuration="20.080546153s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="2026-04-24 16:39:04.386642601 +0000 UTC m=+3.057273479" lastFinishedPulling="2026-04-24 16:39:21.373262332 +0000 UTC m=+20.043893216" observedRunningTime="2026-04-24 16:39:22.079938687 +0000 UTC m=+20.750569587" watchObservedRunningTime="2026-04-24 16:39:22.080546153 +0000 UTC m=+20.751177054" Apr 24 16:39:22.099458 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:22.099417 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vzbf9" podStartSLOduration=3.080707494 podStartE2EDuration="20.099403546s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="2026-04-24 16:39:04.388210457 +0000 UTC m=+3.058841353" lastFinishedPulling="2026-04-24 16:39:21.406906511 +0000 UTC m=+20.077537405" observedRunningTime="2026-04-24 16:39:22.098808542 +0000 UTC m=+20.769439444" watchObservedRunningTime="2026-04-24 16:39:22.099403546 +0000 UTC m=+20.770034447" Apr 24 16:39:22.333718 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:22.333653 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xxmc2" Apr 24 16:39:22.334704 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:22.334679 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xxmc2" Apr 24 16:39:22.634441 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:22.634414 2563 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 16:39:22.825710 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:22.825592 2563 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T16:39:22.634436643Z","UUID":"c875715d-67a8-4c8c-8a6a-64c41e4d3b18","Handler":null,"Name":"","Endpoint":""} Apr 24 16:39:22.829969 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:22.829943 2563 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 16:39:22.829969 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:22.829971 2563 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 16:39:23.000433 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:23.000388 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" event={"ID":"55acf3aa-ca3c-4efa-893f-473501b43621","Type":"ContainerStarted","Data":"058f34dd100549d83c7ac1c1cf2210bd86a0ee95af4faebe4f931c7780baa402"} Apr 24 16:39:23.000433 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:23.000432 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" event={"ID":"55acf3aa-ca3c-4efa-893f-473501b43621","Type":"ContainerStarted","Data":"c863b5a889728a7c2e790015210f97b16181946d15dcd8bb8d5c655751d5b0df"} Apr 24 16:39:23.000646 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:23.000445 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" event={"ID":"55acf3aa-ca3c-4efa-893f-473501b43621","Type":"ContainerStarted","Data":"db8d29a5fc4cd81e44e09ba0b7611cf6e0e456a06969906746a1ec72ea2bd137"} Apr 24 16:39:23.002077 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:23.002048 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-bfs48" 
event={"ID":"5e8ebda8-9661-4160-961c-49db3596480b","Type":"ContainerStarted","Data":"bc8080b33f2e685da668892ca6a9448b56338e655962557c5540359b2586840f"} Apr 24 16:39:23.004380 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:23.004355 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574" event={"ID":"fbb828be-4828-44c8-b5ec-6ffba7895c36","Type":"ContainerStarted","Data":"7de763bef1a4652105718d4072383fc46c53bdd02cefb36c45c61ab7cb03a1b2"} Apr 24 16:39:23.005119 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:23.004985 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xxmc2" Apr 24 16:39:23.005582 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:23.005564 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xxmc2" Apr 24 16:39:23.026835 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:23.026737 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-bfs48" podStartSLOduration=4.339921089 podStartE2EDuration="21.026721662s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="2026-04-24 16:39:04.385746739 +0000 UTC m=+3.056377619" lastFinishedPulling="2026-04-24 16:39:21.072547297 +0000 UTC m=+19.743178192" observedRunningTime="2026-04-24 16:39:23.025760506 +0000 UTC m=+21.696391446" watchObservedRunningTime="2026-04-24 16:39:23.026721662 +0000 UTC m=+21.697352564" Apr 24 16:39:23.828070 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:23.827988 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c883a6e5-b74b-4d33-9372-a3da5fd267f6-original-pull-secret\") pod \"global-pull-secret-syncer-6swzl\" (UID: \"c883a6e5-b74b-4d33-9372-a3da5fd267f6\") " pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:23.828658 
ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:23.828176 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:23.828658 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:23.828254 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c883a6e5-b74b-4d33-9372-a3da5fd267f6-original-pull-secret podName:c883a6e5-b74b-4d33-9372-a3da5fd267f6 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:27.828236764 +0000 UTC m=+26.498867643 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c883a6e5-b74b-4d33-9372-a3da5fd267f6-original-pull-secret") pod "global-pull-secret-syncer-6swzl" (UID: "c883a6e5-b74b-4d33-9372-a3da5fd267f6") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:23.855404 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:23.855377 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:23.855567 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:23.855512 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzld5" podUID="b98926f9-5237-4269-877b-422b4e1c6edf" Apr 24 16:39:23.855971 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:23.855950 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:39:23.856082 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:23.856059 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spk8g" podUID="1fb076e8-3881-4444-98b6-2d67d3820579" Apr 24 16:39:23.856156 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:23.856114 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:23.856229 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:23.856207 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6swzl" podUID="c883a6e5-b74b-4d33-9372-a3da5fd267f6" Apr 24 16:39:24.007910 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:24.007869 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574" event={"ID":"fbb828be-4828-44c8-b5ec-6ffba7895c36","Type":"ContainerStarted","Data":"e1b791d5ac91a7d15930547480f5f867aef0b8e9425ad36f1349e30641144d31"} Apr 24 16:39:24.028979 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:24.028921 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5c574" podStartSLOduration=3.915821721 podStartE2EDuration="23.028905507s" podCreationTimestamp="2026-04-24 16:39:01 +0000 UTC" firstStartedPulling="2026-04-24 16:39:04.381911627 +0000 UTC m=+3.052542505" lastFinishedPulling="2026-04-24 16:39:23.49499541 +0000 UTC m=+22.165626291" observedRunningTime="2026-04-24 16:39:24.028426349 +0000 UTC m=+22.699057252" watchObservedRunningTime="2026-04-24 16:39:24.028905507 +0000 UTC m=+22.699536412" Apr 24 16:39:25.013641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:25.013397 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" event={"ID":"55acf3aa-ca3c-4efa-893f-473501b43621","Type":"ContainerStarted","Data":"1ca14ece5ae490494c7c76cfbf664ed4615ebbfad148fc90db6a0fdd09b814fb"} Apr 24 16:39:25.854910 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:25.854882 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:39:25.855088 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:25.854887 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:25.855088 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:25.854972 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spk8g" podUID="1fb076e8-3881-4444-98b6-2d67d3820579" Apr 24 16:39:25.855088 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:25.855059 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzld5" podUID="b98926f9-5237-4269-877b-422b4e1c6edf" Apr 24 16:39:25.855088 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:25.854887 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:25.855254 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:25.855155 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6swzl" podUID="c883a6e5-b74b-4d33-9372-a3da5fd267f6" Apr 24 16:39:27.018945 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:27.018701 2563 generic.go:358] "Generic (PLEG): container finished" podID="36eba3f6-5e75-4e04-8052-6248d70f2dd3" containerID="6cdb428f3d4ba26852c9feac32b6341a61ae3b56240e8c2fed9eb2696658d2cb" exitCode=0 Apr 24 16:39:27.019670 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:27.018775 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhp56" event={"ID":"36eba3f6-5e75-4e04-8052-6248d70f2dd3","Type":"ContainerDied","Data":"6cdb428f3d4ba26852c9feac32b6341a61ae3b56240e8c2fed9eb2696658d2cb"} Apr 24 16:39:27.022281 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:27.022257 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" event={"ID":"55acf3aa-ca3c-4efa-893f-473501b43621","Type":"ContainerStarted","Data":"5beae1c16fb4d630b50eb3534009f8716f06f02b434652c3d11df38ff2612491"} Apr 24 16:39:27.022593 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:27.022571 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:27.022693 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:27.022601 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:27.036338 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:27.036317 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:27.068074 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:27.068031 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" podStartSLOduration=7.71838321 podStartE2EDuration="25.068019829s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" 
firstStartedPulling="2026-04-24 16:39:04.383199903 +0000 UTC m=+3.053830784" lastFinishedPulling="2026-04-24 16:39:21.732836494 +0000 UTC m=+20.403467403" observedRunningTime="2026-04-24 16:39:27.06703233 +0000 UTC m=+25.737663254" watchObservedRunningTime="2026-04-24 16:39:27.068019829 +0000 UTC m=+25.738650730" Apr 24 16:39:27.855344 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:27.855319 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:27.855447 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:27.855358 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c883a6e5-b74b-4d33-9372-a3da5fd267f6-original-pull-secret\") pod \"global-pull-secret-syncer-6swzl\" (UID: \"c883a6e5-b74b-4d33-9372-a3da5fd267f6\") " pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:27.855447 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:27.855435 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:27.855566 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:27.855463 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:27.855566 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:27.855470 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzld5" podUID="b98926f9-5237-4269-877b-422b4e1c6edf" Apr 24 16:39:27.855566 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:27.855507 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c883a6e5-b74b-4d33-9372-a3da5fd267f6-original-pull-secret podName:c883a6e5-b74b-4d33-9372-a3da5fd267f6 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:35.85548975 +0000 UTC m=+34.526120637 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c883a6e5-b74b-4d33-9372-a3da5fd267f6-original-pull-secret") pod "global-pull-secret-syncer-6swzl" (UID: "c883a6e5-b74b-4d33-9372-a3da5fd267f6") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:27.855566 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:27.855548 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6swzl" podUID="c883a6e5-b74b-4d33-9372-a3da5fd267f6" Apr 24 16:39:27.855703 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:27.855599 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:39:27.855703 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:27.855689 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-spk8g" podUID="1fb076e8-3881-4444-98b6-2d67d3820579" Apr 24 16:39:28.025962 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:28.025922 2563 generic.go:358] "Generic (PLEG): container finished" podID="36eba3f6-5e75-4e04-8052-6248d70f2dd3" containerID="efac6ff0a48ba266c8eb68a3dbf0c672ef6d9889a8ab7a6e640d3137a014c99a" exitCode=0 Apr 24 16:39:28.026365 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:28.025993 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhp56" event={"ID":"36eba3f6-5e75-4e04-8052-6248d70f2dd3","Type":"ContainerDied","Data":"efac6ff0a48ba266c8eb68a3dbf0c672ef6d9889a8ab7a6e640d3137a014c99a"} Apr 24 16:39:28.026720 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:28.026706 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:28.040096 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:28.040072 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:39:28.204945 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:28.204916 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bzld5"] Apr 24 16:39:28.205100 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:28.205031 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:28.205189 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:28.205114 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bzld5" podUID="b98926f9-5237-4269-877b-422b4e1c6edf"
Apr 24 16:39:28.208089 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:28.208061 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6swzl"]
Apr 24 16:39:28.208215 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:28.208170 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6swzl"
Apr 24 16:39:28.208272 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:28.208255 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6swzl" podUID="c883a6e5-b74b-4d33-9372-a3da5fd267f6"
Apr 24 16:39:28.208792 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:28.208760 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-spk8g"]
Apr 24 16:39:28.208882 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:28.208858 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spk8g"
Apr 24 16:39:28.208964 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:28.208943 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spk8g" podUID="1fb076e8-3881-4444-98b6-2d67d3820579"
Apr 24 16:39:29.029686 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:29.029603 2563 generic.go:358] "Generic (PLEG): container finished" podID="36eba3f6-5e75-4e04-8052-6248d70f2dd3" containerID="43c1daf7c3d1aef42d0443f791d85e4336e292af40d8191b649a9cb300e1f874" exitCode=0
Apr 24 16:39:29.030016 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:29.029691 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhp56" event={"ID":"36eba3f6-5e75-4e04-8052-6248d70f2dd3","Type":"ContainerDied","Data":"43c1daf7c3d1aef42d0443f791d85e4336e292af40d8191b649a9cb300e1f874"}
Apr 24 16:39:29.854434 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:29.854404 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzld5"
Apr 24 16:39:29.854590 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:29.854405 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spk8g"
Apr 24 16:39:29.854590 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:29.854535 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzld5" podUID="b98926f9-5237-4269-877b-422b4e1c6edf"
Apr 24 16:39:29.854683 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:29.854609 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spk8g" podUID="1fb076e8-3881-4444-98b6-2d67d3820579"
Apr 24 16:39:29.854683 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:29.854404 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6swzl"
Apr 24 16:39:29.854747 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:29.854704 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6swzl" podUID="c883a6e5-b74b-4d33-9372-a3da5fd267f6"
Apr 24 16:39:31.855445 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:31.855265 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spk8g"
Apr 24 16:39:31.855894 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:31.855355 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzld5"
Apr 24 16:39:31.855894 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:31.855533 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spk8g" podUID="1fb076e8-3881-4444-98b6-2d67d3820579"
Apr 24 16:39:31.855894 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:31.855384 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6swzl"
Apr 24 16:39:31.855894 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:31.855624 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzld5" podUID="b98926f9-5237-4269-877b-422b4e1c6edf"
Apr 24 16:39:31.855894 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:31.855743 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6swzl" podUID="c883a6e5-b74b-4d33-9372-a3da5fd267f6"
Apr 24 16:39:33.855153 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:33.855100 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzld5"
Apr 24 16:39:33.855708 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:33.855109 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spk8g"
Apr 24 16:39:33.855708 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:33.855254 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bzld5" podUID="b98926f9-5237-4269-877b-422b4e1c6edf"
Apr 24 16:39:33.855708 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:33.855109 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6swzl"
Apr 24 16:39:33.855708 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:33.855348 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spk8g" podUID="1fb076e8-3881-4444-98b6-2d67d3820579"
Apr 24 16:39:33.855708 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:33.855450 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6swzl" podUID="c883a6e5-b74b-4d33-9372-a3da5fd267f6"
Apr 24 16:39:34.162192 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.162107 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-69.ec2.internal" event="NodeReady"
Apr 24 16:39:34.162352 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.162256 2563 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 16:39:34.196562 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.196532 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7f96c945bd-7bmhc"]
Apr 24 16:39:34.228791 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.228766 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7f96c945bd-7bmhc"]
Apr 24 16:39:34.228791 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.228792 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vwpz2"]
Apr 24 16:39:34.229002 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.228928 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.231015 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.230991 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 16:39:34.231186 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.231028 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 16:39:34.231505 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.231489 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-t5hh5\""
Apr 24 16:39:34.231763 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.231746 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 16:39:34.238222 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.238204 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 24 16:39:34.243578 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.243499 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rcmmv"]
Apr 24 16:39:34.243661 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.243612 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vwpz2"
Apr 24 16:39:34.249443 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.249394 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ljxll\""
Apr 24 16:39:34.249689 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.249674 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 16:39:34.250458 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.250071 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 16:39:34.261782 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.261761 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vwpz2"]
Apr 24 16:39:34.261782 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.261782 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rcmmv"]
Apr 24 16:39:34.261905 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.261868 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rcmmv"
Apr 24 16:39:34.263913 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.263894 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 16:39:34.263913 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.263905 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2lclw\""
Apr 24 16:39:34.264282 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.264259 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 16:39:34.264374 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.264263 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 16:39:34.306346 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.306324 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czt5x\" (UniqueName: \"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-kube-api-access-czt5x\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.306446 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.306374 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2af4c218-5178-4a23-b52e-7e4972f6785c-image-registry-private-configuration\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.306446 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.306398 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.306446 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.306423 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-certificates\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.306446 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.306445 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2af4c218-5178-4a23-b52e-7e4972f6785c-trusted-ca\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.306609 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.306486 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-bound-sa-token\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.306609 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.306574 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2af4c218-5178-4a23-b52e-7e4972f6785c-installation-pull-secrets\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.306669 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.306617 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2af4c218-5178-4a23-b52e-7e4972f6785c-ca-trust-extracted\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.407699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.407672 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2af4c218-5178-4a23-b52e-7e4972f6785c-image-registry-private-configuration\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.407699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.407705 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.407935 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.407724 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-certificates\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.407935 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.407740 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2af4c218-5178-4a23-b52e-7e4972f6785c-trusted-ca\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.407935 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.407775 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm7w6\" (UniqueName: \"kubernetes.io/projected/5f5bde11-c32a-402f-994e-143e03a8dd70-kube-api-access-zm7w6\") pod \"ingress-canary-rcmmv\" (UID: \"5f5bde11-c32a-402f-994e-143e03a8dd70\") " pod="openshift-ingress-canary/ingress-canary-rcmmv"
Apr 24 16:39:34.407935 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.407797 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-bound-sa-token\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.407935 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.407815 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2"
Apr 24 16:39:34.407935 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:34.407832 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 16:39:34.407935 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:34.407854 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f96c945bd-7bmhc: secret "image-registry-tls" not found
Apr 24 16:39:34.407935 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.407868 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/086b1d61-526e-4540-a577-70f6d4cc1109-config-volume\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2"
Apr 24 16:39:34.407935 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.407905 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2af4c218-5178-4a23-b52e-7e4972f6785c-installation-pull-secrets\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.407935 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:34.407939 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls podName:2af4c218-5178-4a23-b52e-7e4972f6785c nodeName:}" failed. No retries permitted until 2026-04-24 16:39:34.907913257 +0000 UTC m=+33.578544155 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls") pod "image-registry-7f96c945bd-7bmhc" (UID: "2af4c218-5178-4a23-b52e-7e4972f6785c") : secret "image-registry-tls" not found
Apr 24 16:39:34.408425 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.408019 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2af4c218-5178-4a23-b52e-7e4972f6785c-ca-trust-extracted\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.408425 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.408056 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgvxp\" (UniqueName: \"kubernetes.io/projected/086b1d61-526e-4540-a577-70f6d4cc1109-kube-api-access-vgvxp\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2"
Apr 24 16:39:34.408425 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.408096 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czt5x\" (UniqueName: \"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-kube-api-access-czt5x\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.408425 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.408127 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/086b1d61-526e-4540-a577-70f6d4cc1109-tmp-dir\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2"
Apr 24 16:39:34.408425 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.408198 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert\") pod \"ingress-canary-rcmmv\" (UID: \"5f5bde11-c32a-402f-994e-143e03a8dd70\") " pod="openshift-ingress-canary/ingress-canary-rcmmv"
Apr 24 16:39:34.408752 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.408732 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-certificates\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.408841 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.408821 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2af4c218-5178-4a23-b52e-7e4972f6785c-ca-trust-extracted\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.412316 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.409633 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2af4c218-5178-4a23-b52e-7e4972f6785c-trusted-ca\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.413374 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.413347 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2af4c218-5178-4a23-b52e-7e4972f6785c-image-registry-private-configuration\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.413980 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.413963 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2af4c218-5178-4a23-b52e-7e4972f6785c-installation-pull-secrets\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.418014 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.417995 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-bound-sa-token\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.421641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.421614 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czt5x\" (UniqueName: \"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-kube-api-access-czt5x\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.508729 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.508699 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert\") pod \"ingress-canary-rcmmv\" (UID: \"5f5bde11-c32a-402f-994e-143e03a8dd70\") " pod="openshift-ingress-canary/ingress-canary-rcmmv"
Apr 24 16:39:34.508867 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.508771 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zm7w6\" (UniqueName: \"kubernetes.io/projected/5f5bde11-c32a-402f-994e-143e03a8dd70-kube-api-access-zm7w6\") pod \"ingress-canary-rcmmv\" (UID: \"5f5bde11-c32a-402f-994e-143e03a8dd70\") " pod="openshift-ingress-canary/ingress-canary-rcmmv"
Apr 24 16:39:34.508867 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.508805 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2"
Apr 24 16:39:34.508867 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.508833 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/086b1d61-526e-4540-a577-70f6d4cc1109-config-volume\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2"
Apr 24 16:39:34.509036 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:34.508864 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:34.509036 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:34.508918 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert podName:5f5bde11-c32a-402f-994e-143e03a8dd70 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:35.008903853 +0000 UTC m=+33.679534731 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert") pod "ingress-canary-rcmmv" (UID: "5f5bde11-c32a-402f-994e-143e03a8dd70") : secret "canary-serving-cert" not found
Apr 24 16:39:34.509036 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:34.508931 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:34.509036 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:34.508994 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls podName:086b1d61-526e-4540-a577-70f6d4cc1109 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:35.008973726 +0000 UTC m=+33.679604624 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls") pod "dns-default-vwpz2" (UID: "086b1d61-526e-4540-a577-70f6d4cc1109") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:34.509036 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.509015 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgvxp\" (UniqueName: \"kubernetes.io/projected/086b1d61-526e-4540-a577-70f6d4cc1109-kube-api-access-vgvxp\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2"
Apr 24 16:39:34.509327 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.509042 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/086b1d61-526e-4540-a577-70f6d4cc1109-tmp-dir\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2"
Apr 24 16:39:34.509327 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.509305 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/086b1d61-526e-4540-a577-70f6d4cc1109-tmp-dir\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2"
Apr 24 16:39:34.509430 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.509399 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/086b1d61-526e-4540-a577-70f6d4cc1109-config-volume\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2"
Apr 24 16:39:34.523412 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.523391 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgvxp\" (UniqueName: \"kubernetes.io/projected/086b1d61-526e-4540-a577-70f6d4cc1109-kube-api-access-vgvxp\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2"
Apr 24 16:39:34.523501 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.523435 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm7w6\" (UniqueName: \"kubernetes.io/projected/5f5bde11-c32a-402f-994e-143e03a8dd70-kube-api-access-zm7w6\") pod \"ingress-canary-rcmmv\" (UID: \"5f5bde11-c32a-402f-994e-143e03a8dd70\") " pod="openshift-ingress-canary/ingress-canary-rcmmv"
Apr 24 16:39:34.911818 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:34.911795 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:39:34.912348 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:34.911939 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 16:39:34.912348 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:34.911955 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f96c945bd-7bmhc: secret "image-registry-tls" not found
Apr 24 16:39:34.912348 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:34.912008 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls podName:2af4c218-5178-4a23-b52e-7e4972f6785c nodeName:}" failed. No retries permitted until 2026-04-24 16:39:35.911992419 +0000 UTC m=+34.582623298 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls") pod "image-registry-7f96c945bd-7bmhc" (UID: "2af4c218-5178-4a23-b52e-7e4972f6785c") : secret "image-registry-tls" not found
Apr 24 16:39:35.012528 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:35.012496 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert\") pod \"ingress-canary-rcmmv\" (UID: \"5f5bde11-c32a-402f-994e-143e03a8dd70\") " pod="openshift-ingress-canary/ingress-canary-rcmmv"
Apr 24 16:39:35.012707 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:35.012568 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2"
Apr 24 16:39:35.012707 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:35.012660 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:35.012707 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:35.012670 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:35.012814 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:35.012716 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls podName:086b1d61-526e-4540-a577-70f6d4cc1109 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:36.012701322 +0000 UTC m=+34.683332201 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls") pod "dns-default-vwpz2" (UID: "086b1d61-526e-4540-a577-70f6d4cc1109") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:35.012814 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:35.012731 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert podName:5f5bde11-c32a-402f-994e-143e03a8dd70 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:36.012723483 +0000 UTC m=+34.683354363 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert") pod "ingress-canary-rcmmv" (UID: "5f5bde11-c32a-402f-994e-143e03a8dd70") : secret "canary-serving-cert" not found
Apr 24 16:39:35.044279 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:35.044206 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhp56" event={"ID":"36eba3f6-5e75-4e04-8052-6248d70f2dd3","Type":"ContainerStarted","Data":"d50551558673896caecdb250ad125ac10add08e8e1ba7ce26f8dcf89fc0befb1"}
Apr 24 16:39:35.517964 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:35.517931 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5bv7\" (UniqueName: \"kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7\") pod \"network-check-target-spk8g\" (UID: \"1fb076e8-3881-4444-98b6-2d67d3820579\") " pod="openshift-network-diagnostics/network-check-target-spk8g"
Apr 24 16:39:35.518161 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:35.518030 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs\") pod \"network-metrics-daemon-bzld5\" (UID: \"b98926f9-5237-4269-877b-422b4e1c6edf\") " pod="openshift-multus/network-metrics-daemon-bzld5"
Apr 24 16:39:35.518161 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:35.518099 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:39:35.518161 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:35.518125 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:39:35.518161 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:35.518156 2563 projected.go:194] Error preparing data for projected volume kube-api-access-k5bv7 for pod openshift-network-diagnostics/network-check-target-spk8g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:35.518287 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:35.518175 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:35.518287 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:35.518218 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs podName:b98926f9-5237-4269-877b-422b4e1c6edf nodeName:}" failed. No retries permitted until 2026-04-24 16:40:07.518201304 +0000 UTC m=+66.188832190 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs") pod "network-metrics-daemon-bzld5" (UID: "b98926f9-5237-4269-877b-422b4e1c6edf") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:35.518287 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:35.518233 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7 podName:1fb076e8-3881-4444-98b6-2d67d3820579 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:07.518225658 +0000 UTC m=+66.188856540 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "kube-api-access-k5bv7" (UniqueName: "kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7") pod "network-check-target-spk8g" (UID: "1fb076e8-3881-4444-98b6-2d67d3820579") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:35.857409 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:35.857341 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:35.857571 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:35.857342 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:39:35.857636 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:35.857342 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:39:35.861387 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:35.861361 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-czswq\"" Apr 24 16:39:35.861387 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:35.861382 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 16:39:35.861563 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:35.861361 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7ckwm\"" Apr 24 16:39:35.861632 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:35.861584 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 16:39:35.861675 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:35.861650 2563 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 16:39:35.861675 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:35.861661 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 16:39:35.920868 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:35.920848 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c883a6e5-b74b-4d33-9372-a3da5fd267f6-original-pull-secret\") pod \"global-pull-secret-syncer-6swzl\" (UID: \"c883a6e5-b74b-4d33-9372-a3da5fd267f6\") " pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:35.921120 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:35.920909 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc" Apr 24 16:39:35.921120 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:35.921054 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:39:35.921120 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:35.921070 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f96c945bd-7bmhc: secret "image-registry-tls" not found Apr 24 16:39:35.921120 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:35.921117 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls podName:2af4c218-5178-4a23-b52e-7e4972f6785c nodeName:}" failed. 
No retries permitted until 2026-04-24 16:39:37.921103279 +0000 UTC m=+36.591734163 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls") pod "image-registry-7f96c945bd-7bmhc" (UID: "2af4c218-5178-4a23-b52e-7e4972f6785c") : secret "image-registry-tls" not found Apr 24 16:39:35.923044 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:35.923021 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c883a6e5-b74b-4d33-9372-a3da5fd267f6-original-pull-secret\") pod \"global-pull-secret-syncer-6swzl\" (UID: \"c883a6e5-b74b-4d33-9372-a3da5fd267f6\") " pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:36.022008 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:36.021978 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert\") pod \"ingress-canary-rcmmv\" (UID: \"5f5bde11-c32a-402f-994e-143e03a8dd70\") " pod="openshift-ingress-canary/ingress-canary-rcmmv" Apr 24 16:39:36.022090 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:36.022049 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2" Apr 24 16:39:36.022177 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:36.022163 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:36.022230 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:36.022179 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:36.022278 
ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:36.022235 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls podName:086b1d61-526e-4540-a577-70f6d4cc1109 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:38.022215372 +0000 UTC m=+36.692846254 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls") pod "dns-default-vwpz2" (UID: "086b1d61-526e-4540-a577-70f6d4cc1109") : secret "dns-default-metrics-tls" not found Apr 24 16:39:36.022278 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:36.022255 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert podName:5f5bde11-c32a-402f-994e-143e03a8dd70 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:38.022245521 +0000 UTC m=+36.692876400 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert") pod "ingress-canary-rcmmv" (UID: "5f5bde11-c32a-402f-994e-143e03a8dd70") : secret "canary-serving-cert" not found Apr 24 16:39:36.048347 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:36.048325 2563 generic.go:358] "Generic (PLEG): container finished" podID="36eba3f6-5e75-4e04-8052-6248d70f2dd3" containerID="d50551558673896caecdb250ad125ac10add08e8e1ba7ce26f8dcf89fc0befb1" exitCode=0 Apr 24 16:39:36.048453 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:36.048368 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhp56" event={"ID":"36eba3f6-5e75-4e04-8052-6248d70f2dd3","Type":"ContainerDied","Data":"d50551558673896caecdb250ad125ac10add08e8e1ba7ce26f8dcf89fc0befb1"} Apr 24 16:39:36.166832 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:36.166813 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6swzl" Apr 24 16:39:36.333447 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:36.333253 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6swzl"] Apr 24 16:39:36.337041 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:36.337018 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc883a6e5_b74b_4d33_9372_a3da5fd267f6.slice/crio-9372a4476b3f7051635beadbddc72abd5bbd5d98da1a3b652bf20557b7772655 WatchSource:0}: Error finding container 9372a4476b3f7051635beadbddc72abd5bbd5d98da1a3b652bf20557b7772655: Status 404 returned error can't find the container with id 9372a4476b3f7051635beadbddc72abd5bbd5d98da1a3b652bf20557b7772655 Apr 24 16:39:37.052156 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:37.052104 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6swzl" event={"ID":"c883a6e5-b74b-4d33-9372-a3da5fd267f6","Type":"ContainerStarted","Data":"9372a4476b3f7051635beadbddc72abd5bbd5d98da1a3b652bf20557b7772655"} Apr 24 16:39:37.055012 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:37.054986 2563 generic.go:358] "Generic (PLEG): container finished" podID="36eba3f6-5e75-4e04-8052-6248d70f2dd3" containerID="24d40629fe1ef2405c546efe0a55471d265f7efb6708af3ba2f9cf2e698672c5" exitCode=0 Apr 24 16:39:37.055120 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:37.055034 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhp56" event={"ID":"36eba3f6-5e75-4e04-8052-6248d70f2dd3","Type":"ContainerDied","Data":"24d40629fe1ef2405c546efe0a55471d265f7efb6708af3ba2f9cf2e698672c5"} Apr 24 16:39:37.937979 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:37.937737 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc" Apr 24 16:39:37.938149 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:37.937904 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:39:37.938149 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:37.938055 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f96c945bd-7bmhc: secret "image-registry-tls" not found Apr 24 16:39:37.938149 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:37.938118 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls podName:2af4c218-5178-4a23-b52e-7e4972f6785c nodeName:}" failed. No retries permitted until 2026-04-24 16:39:41.938099365 +0000 UTC m=+40.608730246 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls") pod "image-registry-7f96c945bd-7bmhc" (UID: "2af4c218-5178-4a23-b52e-7e4972f6785c") : secret "image-registry-tls" not found Apr 24 16:39:38.039487 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:38.039403 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2" Apr 24 16:39:38.039619 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:38.039504 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert\") pod \"ingress-canary-rcmmv\" (UID: \"5f5bde11-c32a-402f-994e-143e03a8dd70\") " pod="openshift-ingress-canary/ingress-canary-rcmmv" Apr 24 16:39:38.039688 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:38.039618 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:38.039688 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:38.039656 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:38.039789 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:38.039700 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls podName:086b1d61-526e-4540-a577-70f6d4cc1109 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:42.039676119 +0000 UTC m=+40.710307001 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls") pod "dns-default-vwpz2" (UID: "086b1d61-526e-4540-a577-70f6d4cc1109") : secret "dns-default-metrics-tls" not found Apr 24 16:39:38.039789 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:38.039722 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert podName:5f5bde11-c32a-402f-994e-143e03a8dd70 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:42.039712443 +0000 UTC m=+40.710343328 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert") pod "ingress-canary-rcmmv" (UID: "5f5bde11-c32a-402f-994e-143e03a8dd70") : secret "canary-serving-cert" not found Apr 24 16:39:38.060515 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:38.060485 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhp56" event={"ID":"36eba3f6-5e75-4e04-8052-6248d70f2dd3","Type":"ContainerStarted","Data":"3692880202e8be5884341c0907db951b79018123e1ef5407c178eb6454dc6e67"} Apr 24 16:39:38.088249 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:38.088197 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hhp56" podStartSLOduration=5.696286656 podStartE2EDuration="36.08817826s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="2026-04-24 16:39:04.384210169 +0000 UTC m=+3.054841062" lastFinishedPulling="2026-04-24 16:39:34.776101787 +0000 UTC m=+33.446732666" observedRunningTime="2026-04-24 16:39:38.08736586 +0000 UTC m=+36.757996786" watchObservedRunningTime="2026-04-24 16:39:38.08817826 +0000 UTC m=+36.758809154" Apr 24 16:39:41.067748 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:41.067715 2563 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kube-system/global-pull-secret-syncer-6swzl" event={"ID":"c883a6e5-b74b-4d33-9372-a3da5fd267f6","Type":"ContainerStarted","Data":"89677ffc46d243e501d23d537261f8e9e8348cc5c35823d619a5d9b370d95140"} Apr 24 16:39:41.089056 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:41.089004 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6swzl" podStartSLOduration=17.190421598 podStartE2EDuration="21.088987857s" podCreationTimestamp="2026-04-24 16:39:20 +0000 UTC" firstStartedPulling="2026-04-24 16:39:36.338680659 +0000 UTC m=+35.009311541" lastFinishedPulling="2026-04-24 16:39:40.237246914 +0000 UTC m=+38.907877800" observedRunningTime="2026-04-24 16:39:41.088981427 +0000 UTC m=+39.759612327" watchObservedRunningTime="2026-04-24 16:39:41.088987857 +0000 UTC m=+39.759618759" Apr 24 16:39:41.969507 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:41.969481 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc" Apr 24 16:39:41.969661 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:41.969620 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:39:41.969661 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:41.969635 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f96c945bd-7bmhc: secret "image-registry-tls" not found Apr 24 16:39:41.969731 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:41.969678 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls 
podName:2af4c218-5178-4a23-b52e-7e4972f6785c nodeName:}" failed. No retries permitted until 2026-04-24 16:39:49.969665309 +0000 UTC m=+48.640296189 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls") pod "image-registry-7f96c945bd-7bmhc" (UID: "2af4c218-5178-4a23-b52e-7e4972f6785c") : secret "image-registry-tls" not found Apr 24 16:39:42.069964 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:42.069937 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert\") pod \"ingress-canary-rcmmv\" (UID: \"5f5bde11-c32a-402f-994e-143e03a8dd70\") " pod="openshift-ingress-canary/ingress-canary-rcmmv" Apr 24 16:39:42.070341 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:42.069991 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2" Apr 24 16:39:42.070341 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:42.070076 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:42.070341 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:42.070079 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:42.070341 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:42.070123 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls podName:086b1d61-526e-4540-a577-70f6d4cc1109 nodeName:}" failed. 
No retries permitted until 2026-04-24 16:39:50.070110417 +0000 UTC m=+48.740741296 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls") pod "dns-default-vwpz2" (UID: "086b1d61-526e-4540-a577-70f6d4cc1109") : secret "dns-default-metrics-tls" not found Apr 24 16:39:42.070341 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:42.070153 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert podName:5f5bde11-c32a-402f-994e-143e03a8dd70 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:50.070130234 +0000 UTC m=+48.740761113 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert") pod "ingress-canary-rcmmv" (UID: "5f5bde11-c32a-402f-994e-143e03a8dd70") : secret "canary-serving-cert" not found Apr 24 16:39:46.285119 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.285090 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9"] Apr 24 16:39:46.330330 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.330303 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8"] Apr 24 16:39:46.330478 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.330460 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9" Apr 24 16:39:46.332966 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.332942 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 16:39:46.333102 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.333026 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 16:39:46.333204 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.333166 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-4q5gf\"" Apr 24 16:39:46.333298 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.333281 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 16:39:46.333826 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.333810 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 24 16:39:46.351199 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.351178 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9"] Apr 24 16:39:46.351199 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.351201 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8"] Apr 24 16:39:46.351314 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.351211 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q"] Apr 24 
16:39:46.351345 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.351320 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" Apr 24 16:39:46.355979 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.355961 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 24 16:39:46.368829 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.368807 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q"] Apr 24 16:39:46.368905 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.368895 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" Apr 24 16:39:46.371401 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.371386 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 24 16:39:46.371699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.371680 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 24 16:39:46.371833 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.371813 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 24 16:39:46.371939 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.371925 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 24 16:39:46.502118 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.502085 2563 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/dd3c5a62-57fa-4943-8b21-975dd46640ad-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9\" (UID: \"dd3c5a62-57fa-4943-8b21-975dd46640ad\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9" Apr 24 16:39:46.502118 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.502125 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eee9b715-088d-45ee-b475-6b3eeb422603-tmp\") pod \"klusterlet-addon-workmgr-5f7554cd7f-pbvs8\" (UID: \"eee9b715-088d-45ee-b475-6b3eeb422603\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" Apr 24 16:39:46.502309 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.502183 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/eee9b715-088d-45ee-b475-6b3eeb422603-klusterlet-config\") pod \"klusterlet-addon-workmgr-5f7554cd7f-pbvs8\" (UID: \"eee9b715-088d-45ee-b475-6b3eeb422603\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" Apr 24 16:39:46.502309 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.502201 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn79m\" (UniqueName: \"kubernetes.io/projected/eee9b715-088d-45ee-b475-6b3eeb422603-kube-api-access-nn79m\") pod \"klusterlet-addon-workmgr-5f7554cd7f-pbvs8\" (UID: \"eee9b715-088d-45ee-b475-6b3eeb422603\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" Apr 24 16:39:46.502309 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.502273 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/13f93392-072f-43d6-839d-2c258de6c94b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-864b646dc6-9ph5q\" (UID: \"13f93392-072f-43d6-839d-2c258de6c94b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" Apr 24 16:39:46.502309 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.502303 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/13f93392-072f-43d6-839d-2c258de6c94b-ca\") pod \"cluster-proxy-proxy-agent-864b646dc6-9ph5q\" (UID: \"13f93392-072f-43d6-839d-2c258de6c94b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" Apr 24 16:39:46.502436 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.502320 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/13f93392-072f-43d6-839d-2c258de6c94b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-864b646dc6-9ph5q\" (UID: \"13f93392-072f-43d6-839d-2c258de6c94b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" Apr 24 16:39:46.502436 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.502338 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twpzj\" (UniqueName: \"kubernetes.io/projected/13f93392-072f-43d6-839d-2c258de6c94b-kube-api-access-twpzj\") pod \"cluster-proxy-proxy-agent-864b646dc6-9ph5q\" (UID: \"13f93392-072f-43d6-839d-2c258de6c94b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" Apr 24 16:39:46.502436 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.502360 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: 
\"kubernetes.io/secret/13f93392-072f-43d6-839d-2c258de6c94b-hub\") pod \"cluster-proxy-proxy-agent-864b646dc6-9ph5q\" (UID: \"13f93392-072f-43d6-839d-2c258de6c94b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" Apr 24 16:39:46.502436 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.502396 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/13f93392-072f-43d6-839d-2c258de6c94b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-864b646dc6-9ph5q\" (UID: \"13f93392-072f-43d6-839d-2c258de6c94b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" Apr 24 16:39:46.502436 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.502425 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdq8p\" (UniqueName: \"kubernetes.io/projected/dd3c5a62-57fa-4943-8b21-975dd46640ad-kube-api-access-jdq8p\") pod \"managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9\" (UID: \"dd3c5a62-57fa-4943-8b21-975dd46640ad\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9" Apr 24 16:39:46.603306 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.603242 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/13f93392-072f-43d6-839d-2c258de6c94b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-864b646dc6-9ph5q\" (UID: \"13f93392-072f-43d6-839d-2c258de6c94b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" Apr 24 16:39:46.603306 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.603277 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/13f93392-072f-43d6-839d-2c258de6c94b-ca\") pod 
\"cluster-proxy-proxy-agent-864b646dc6-9ph5q\" (UID: \"13f93392-072f-43d6-839d-2c258de6c94b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" Apr 24 16:39:46.603306 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.603293 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/13f93392-072f-43d6-839d-2c258de6c94b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-864b646dc6-9ph5q\" (UID: \"13f93392-072f-43d6-839d-2c258de6c94b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" Apr 24 16:39:46.603471 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.603418 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twpzj\" (UniqueName: \"kubernetes.io/projected/13f93392-072f-43d6-839d-2c258de6c94b-kube-api-access-twpzj\") pod \"cluster-proxy-proxy-agent-864b646dc6-9ph5q\" (UID: \"13f93392-072f-43d6-839d-2c258de6c94b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" Apr 24 16:39:46.603471 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.603466 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/13f93392-072f-43d6-839d-2c258de6c94b-hub\") pod \"cluster-proxy-proxy-agent-864b646dc6-9ph5q\" (UID: \"13f93392-072f-43d6-839d-2c258de6c94b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" Apr 24 16:39:46.603545 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.603492 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/13f93392-072f-43d6-839d-2c258de6c94b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-864b646dc6-9ph5q\" (UID: \"13f93392-072f-43d6-839d-2c258de6c94b\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" Apr 24 16:39:46.603639 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.603620 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdq8p\" (UniqueName: \"kubernetes.io/projected/dd3c5a62-57fa-4943-8b21-975dd46640ad-kube-api-access-jdq8p\") pod \"managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9\" (UID: \"dd3c5a62-57fa-4943-8b21-975dd46640ad\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9" Apr 24 16:39:46.603730 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.603712 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/dd3c5a62-57fa-4943-8b21-975dd46640ad-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9\" (UID: \"dd3c5a62-57fa-4943-8b21-975dd46640ad\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9" Apr 24 16:39:46.603781 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.603762 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eee9b715-088d-45ee-b475-6b3eeb422603-tmp\") pod \"klusterlet-addon-workmgr-5f7554cd7f-pbvs8\" (UID: \"eee9b715-088d-45ee-b475-6b3eeb422603\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" Apr 24 16:39:46.603833 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.603806 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/eee9b715-088d-45ee-b475-6b3eeb422603-klusterlet-config\") pod \"klusterlet-addon-workmgr-5f7554cd7f-pbvs8\" (UID: \"eee9b715-088d-45ee-b475-6b3eeb422603\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" Apr 24 16:39:46.603889 ip-10-0-137-69 
kubenswrapper[2563]: I0424 16:39:46.603841 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nn79m\" (UniqueName: \"kubernetes.io/projected/eee9b715-088d-45ee-b475-6b3eeb422603-kube-api-access-nn79m\") pod \"klusterlet-addon-workmgr-5f7554cd7f-pbvs8\" (UID: \"eee9b715-088d-45ee-b475-6b3eeb422603\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" Apr 24 16:39:46.603938 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.603908 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/13f93392-072f-43d6-839d-2c258de6c94b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-864b646dc6-9ph5q\" (UID: \"13f93392-072f-43d6-839d-2c258de6c94b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" Apr 24 16:39:46.604369 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.604314 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eee9b715-088d-45ee-b475-6b3eeb422603-tmp\") pod \"klusterlet-addon-workmgr-5f7554cd7f-pbvs8\" (UID: \"eee9b715-088d-45ee-b475-6b3eeb422603\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" Apr 24 16:39:46.608499 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.608474 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/13f93392-072f-43d6-839d-2c258de6c94b-ca\") pod \"cluster-proxy-proxy-agent-864b646dc6-9ph5q\" (UID: \"13f93392-072f-43d6-839d-2c258de6c94b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" Apr 24 16:39:46.608599 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.608476 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: 
\"kubernetes.io/secret/13f93392-072f-43d6-839d-2c258de6c94b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-864b646dc6-9ph5q\" (UID: \"13f93392-072f-43d6-839d-2c258de6c94b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" Apr 24 16:39:46.608599 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.608513 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/dd3c5a62-57fa-4943-8b21-975dd46640ad-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9\" (UID: \"dd3c5a62-57fa-4943-8b21-975dd46640ad\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9" Apr 24 16:39:46.608599 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.608510 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/13f93392-072f-43d6-839d-2c258de6c94b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-864b646dc6-9ph5q\" (UID: \"13f93392-072f-43d6-839d-2c258de6c94b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" Apr 24 16:39:46.608599 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.608552 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/13f93392-072f-43d6-839d-2c258de6c94b-hub\") pod \"cluster-proxy-proxy-agent-864b646dc6-9ph5q\" (UID: \"13f93392-072f-43d6-839d-2c258de6c94b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" Apr 24 16:39:46.608599 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.608580 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/eee9b715-088d-45ee-b475-6b3eeb422603-klusterlet-config\") pod \"klusterlet-addon-workmgr-5f7554cd7f-pbvs8\" (UID: \"eee9b715-088d-45ee-b475-6b3eeb422603\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" Apr 24 16:39:46.611309 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.611285 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twpzj\" (UniqueName: \"kubernetes.io/projected/13f93392-072f-43d6-839d-2c258de6c94b-kube-api-access-twpzj\") pod \"cluster-proxy-proxy-agent-864b646dc6-9ph5q\" (UID: \"13f93392-072f-43d6-839d-2c258de6c94b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" Apr 24 16:39:46.611646 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.611629 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn79m\" (UniqueName: \"kubernetes.io/projected/eee9b715-088d-45ee-b475-6b3eeb422603-kube-api-access-nn79m\") pod \"klusterlet-addon-workmgr-5f7554cd7f-pbvs8\" (UID: \"eee9b715-088d-45ee-b475-6b3eeb422603\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" Apr 24 16:39:46.611686 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.611639 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdq8p\" (UniqueName: \"kubernetes.io/projected/dd3c5a62-57fa-4943-8b21-975dd46640ad-kube-api-access-jdq8p\") pod \"managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9\" (UID: \"dd3c5a62-57fa-4943-8b21-975dd46640ad\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9" Apr 24 16:39:46.656655 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.656623 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9" Apr 24 16:39:46.677560 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.677536 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" Apr 24 16:39:46.677668 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.677593 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" Apr 24 16:39:46.829361 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.829337 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8"] Apr 24 16:39:46.838544 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:46.838519 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeee9b715_088d_45ee_b475_6b3eeb422603.slice/crio-7934ba13619bf477181a00278b9c70e0851eb09f6c2dde5c8b681bc1d5c32828 WatchSource:0}: Error finding container 7934ba13619bf477181a00278b9c70e0851eb09f6c2dde5c8b681bc1d5c32828: Status 404 returned error can't find the container with id 7934ba13619bf477181a00278b9c70e0851eb09f6c2dde5c8b681bc1d5c32828 Apr 24 16:39:46.850236 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:46.850209 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q"] Apr 24 16:39:46.853423 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:46.853372 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13f93392_072f_43d6_839d_2c258de6c94b.slice/crio-96a0fa777e99d10eaf904a6fbaa9e8e8b6ab725effb72db544e524cd5092adac WatchSource:0}: Error finding container 96a0fa777e99d10eaf904a6fbaa9e8e8b6ab725effb72db544e524cd5092adac: Status 404 returned error can't find the container with id 96a0fa777e99d10eaf904a6fbaa9e8e8b6ab725effb72db544e524cd5092adac Apr 24 16:39:47.003989 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:47.003962 2563 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9"] Apr 24 16:39:47.020677 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:39:47.020652 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd3c5a62_57fa_4943_8b21_975dd46640ad.slice/crio-88a8eed37d8e21b0d9b464dda7f0dbde942e00c01cb75985477435a0605300e4 WatchSource:0}: Error finding container 88a8eed37d8e21b0d9b464dda7f0dbde942e00c01cb75985477435a0605300e4: Status 404 returned error can't find the container with id 88a8eed37d8e21b0d9b464dda7f0dbde942e00c01cb75985477435a0605300e4 Apr 24 16:39:47.081148 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:47.081108 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9" event={"ID":"dd3c5a62-57fa-4943-8b21-975dd46640ad","Type":"ContainerStarted","Data":"88a8eed37d8e21b0d9b464dda7f0dbde942e00c01cb75985477435a0605300e4"} Apr 24 16:39:47.082077 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:47.082041 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" event={"ID":"13f93392-072f-43d6-839d-2c258de6c94b","Type":"ContainerStarted","Data":"96a0fa777e99d10eaf904a6fbaa9e8e8b6ab725effb72db544e524cd5092adac"} Apr 24 16:39:47.083048 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:47.083028 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" event={"ID":"eee9b715-088d-45ee-b475-6b3eeb422603","Type":"ContainerStarted","Data":"7934ba13619bf477181a00278b9c70e0851eb09f6c2dde5c8b681bc1d5c32828"} Apr 24 16:39:50.032479 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:50.032449 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc" Apr 24 16:39:50.032865 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:50.032606 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:39:50.032865 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:50.032624 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f96c945bd-7bmhc: secret "image-registry-tls" not found Apr 24 16:39:50.032865 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:50.032677 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls podName:2af4c218-5178-4a23-b52e-7e4972f6785c nodeName:}" failed. No retries permitted until 2026-04-24 16:40:06.032660963 +0000 UTC m=+64.703291853 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls") pod "image-registry-7f96c945bd-7bmhc" (UID: "2af4c218-5178-4a23-b52e-7e4972f6785c") : secret "image-registry-tls" not found Apr 24 16:39:50.132924 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:50.132891 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert\") pod \"ingress-canary-rcmmv\" (UID: \"5f5bde11-c32a-402f-994e-143e03a8dd70\") " pod="openshift-ingress-canary/ingress-canary-rcmmv" Apr 24 16:39:50.133103 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:50.132983 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2" Apr 24 16:39:50.133103 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:50.133048 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:50.133103 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:50.133103 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert podName:5f5bde11-c32a-402f-994e-143e03a8dd70 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:06.133090467 +0000 UTC m=+64.803721349 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert") pod "ingress-canary-rcmmv" (UID: "5f5bde11-c32a-402f-994e-143e03a8dd70") : secret "canary-serving-cert" not found Apr 24 16:39:50.133303 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:50.133128 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:50.133303 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:39:50.133232 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls podName:086b1d61-526e-4540-a577-70f6d4cc1109 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:06.133204662 +0000 UTC m=+64.803835550 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls") pod "dns-default-vwpz2" (UID: "086b1d61-526e-4540-a577-70f6d4cc1109") : secret "dns-default-metrics-tls" not found Apr 24 16:39:53.097614 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:53.097579 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9" event={"ID":"dd3c5a62-57fa-4943-8b21-975dd46640ad","Type":"ContainerStarted","Data":"737cd5fdd29bfcada18a15ae836f6c18fb7e5c5f9e772edbad9ba7e44c31856b"} Apr 24 16:39:53.098878 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:53.098855 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" event={"ID":"13f93392-072f-43d6-839d-2c258de6c94b","Type":"ContainerStarted","Data":"a06a7c8aa853536b4f2b228ae73a7666fc859c7e9940662ae8fb8126125ca775"} Apr 24 16:39:53.100028 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:53.100001 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" event={"ID":"eee9b715-088d-45ee-b475-6b3eeb422603","Type":"ContainerStarted","Data":"9b248064052b46c6d9ba64175a25939500c6141e45a91200500133172dfaab66"} Apr 24 16:39:53.100213 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:53.100198 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" Apr 24 16:39:53.101810 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:53.101794 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" Apr 24 16:39:53.112763 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:53.112729 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9" podStartSLOduration=1.612735909 podStartE2EDuration="7.112719089s" podCreationTimestamp="2026-04-24 16:39:46 +0000 UTC" firstStartedPulling="2026-04-24 16:39:47.022590694 +0000 UTC m=+45.693221572" lastFinishedPulling="2026-04-24 16:39:52.522573873 +0000 UTC m=+51.193204752" observedRunningTime="2026-04-24 16:39:53.112485506 +0000 UTC m=+51.783116404" watchObservedRunningTime="2026-04-24 16:39:53.112719089 +0000 UTC m=+51.783349990" Apr 24 16:39:53.129894 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:53.129858 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" podStartSLOduration=1.43137495 podStartE2EDuration="7.129847408s" podCreationTimestamp="2026-04-24 16:39:46 +0000 UTC" firstStartedPulling="2026-04-24 16:39:46.840221605 +0000 UTC m=+45.510852485" lastFinishedPulling="2026-04-24 16:39:52.538694063 +0000 UTC m=+51.209324943" observedRunningTime="2026-04-24 16:39:53.129260461 +0000 UTC m=+51.799891365" 
watchObservedRunningTime="2026-04-24 16:39:53.129847408 +0000 UTC m=+51.800478287" Apr 24 16:39:55.106961 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:55.106923 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" event={"ID":"13f93392-072f-43d6-839d-2c258de6c94b","Type":"ContainerStarted","Data":"bd8f771979f23184f370b0a421f6c230c59c17a2625f6b302bdd0c7929e5d3a9"} Apr 24 16:39:55.107365 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:55.106968 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" event={"ID":"13f93392-072f-43d6-839d-2c258de6c94b","Type":"ContainerStarted","Data":"5afaab722218c3817835bba2d1f85695a534061991025c89b98723f2a3aceb97"} Apr 24 16:39:55.126336 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:39:55.126296 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" podStartSLOduration=1.359061841 podStartE2EDuration="9.126283693s" podCreationTimestamp="2026-04-24 16:39:46 +0000 UTC" firstStartedPulling="2026-04-24 16:39:46.855216987 +0000 UTC m=+45.525847866" lastFinishedPulling="2026-04-24 16:39:54.622438835 +0000 UTC m=+53.293069718" observedRunningTime="2026-04-24 16:39:55.125028268 +0000 UTC m=+53.795659169" watchObservedRunningTime="2026-04-24 16:39:55.126283693 +0000 UTC m=+53.796914593" Apr 24 16:40:00.042725 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:40:00.042697 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dmn7d" Apr 24 16:40:06.057968 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:40:06.057931 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls\") pod 
\"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc" Apr 24 16:40:06.058374 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:40:06.058091 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:40:06.058374 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:40:06.058105 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f96c945bd-7bmhc: secret "image-registry-tls" not found Apr 24 16:40:06.058374 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:40:06.058209 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls podName:2af4c218-5178-4a23-b52e-7e4972f6785c nodeName:}" failed. No retries permitted until 2026-04-24 16:40:38.058189377 +0000 UTC m=+96.728820267 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls") pod "image-registry-7f96c945bd-7bmhc" (UID: "2af4c218-5178-4a23-b52e-7e4972f6785c") : secret "image-registry-tls" not found Apr 24 16:40:06.158699 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:40:06.158670 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert\") pod \"ingress-canary-rcmmv\" (UID: \"5f5bde11-c32a-402f-994e-143e03a8dd70\") " pod="openshift-ingress-canary/ingress-canary-rcmmv" Apr 24 16:40:06.158854 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:40:06.158737 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2" Apr 24 16:40:06.158854 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:40:06.158814 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:40:06.158854 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:40:06.158819 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:40:06.158990 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:40:06.158870 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls podName:086b1d61-526e-4540-a577-70f6d4cc1109 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:38.158856439 +0000 UTC m=+96.829487318 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls") pod "dns-default-vwpz2" (UID: "086b1d61-526e-4540-a577-70f6d4cc1109") : secret "dns-default-metrics-tls" not found Apr 24 16:40:06.158990 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:40:06.158882 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert podName:5f5bde11-c32a-402f-994e-143e03a8dd70 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:38.15887713 +0000 UTC m=+96.829508009 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert") pod "ingress-canary-rcmmv" (UID: "5f5bde11-c32a-402f-994e-143e03a8dd70") : secret "canary-serving-cert" not found Apr 24 16:40:07.569481 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:40:07.569448 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs\") pod \"network-metrics-daemon-bzld5\" (UID: \"b98926f9-5237-4269-877b-422b4e1c6edf\") " pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:40:07.569936 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:40:07.569511 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5bv7\" (UniqueName: \"kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7\") pod \"network-check-target-spk8g\" (UID: \"1fb076e8-3881-4444-98b6-2d67d3820579\") " pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:40:07.571883 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:40:07.571858 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 16:40:07.571994 ip-10-0-137-69 kubenswrapper[2563]: 
I0424 16:40:07.571888 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 16:40:07.579810 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:40:07.579787 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 16:40:07.579929 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:40:07.579839 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs podName:b98926f9-5237-4269-877b-422b4e1c6edf nodeName:}" failed. No retries permitted until 2026-04-24 16:41:11.579824178 +0000 UTC m=+130.250455058 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs") pod "network-metrics-daemon-bzld5" (UID: "b98926f9-5237-4269-877b-422b4e1c6edf") : secret "metrics-daemon-secret" not found Apr 24 16:40:07.581898 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:40:07.581882 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 16:40:07.593059 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:40:07.593036 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5bv7\" (UniqueName: \"kubernetes.io/projected/1fb076e8-3881-4444-98b6-2d67d3820579-kube-api-access-k5bv7\") pod \"network-check-target-spk8g\" (UID: \"1fb076e8-3881-4444-98b6-2d67d3820579\") " pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:40:07.679571 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:40:07.679548 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7ckwm\"" Apr 24 16:40:07.687353 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:40:07.687334 2563 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:40:07.797776 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:40:07.797724 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-spk8g"] Apr 24 16:40:07.802899 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:40:07.802870 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fb076e8_3881_4444_98b6_2d67d3820579.slice/crio-cd35b86bcab1df9a8b70133c086deed1f8093dd14e8981ace8eb3125f781d948 WatchSource:0}: Error finding container cd35b86bcab1df9a8b70133c086deed1f8093dd14e8981ace8eb3125f781d948: Status 404 returned error can't find the container with id cd35b86bcab1df9a8b70133c086deed1f8093dd14e8981ace8eb3125f781d948 Apr 24 16:40:08.137987 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:40:08.137945 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-spk8g" event={"ID":"1fb076e8-3881-4444-98b6-2d67d3820579","Type":"ContainerStarted","Data":"cd35b86bcab1df9a8b70133c086deed1f8093dd14e8981ace8eb3125f781d948"} Apr 24 16:40:11.146582 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:40:11.146546 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-spk8g" event={"ID":"1fb076e8-3881-4444-98b6-2d67d3820579","Type":"ContainerStarted","Data":"c124df688de8a723fd3eaa05dc7cf32fca685df8e29cb56491e6636af5f8ce14"} Apr 24 16:40:11.146956 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:40:11.146683 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:40:11.162922 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:40:11.162880 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-spk8g" podStartSLOduration=66.621286803 podStartE2EDuration="1m9.16286917s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="2026-04-24 16:40:07.804781809 +0000 UTC m=+66.475412688" lastFinishedPulling="2026-04-24 16:40:10.346364175 +0000 UTC m=+69.016995055" observedRunningTime="2026-04-24 16:40:11.16216303 +0000 UTC m=+69.832793933" watchObservedRunningTime="2026-04-24 16:40:11.16286917 +0000 UTC m=+69.833500086" Apr 24 16:40:38.079868 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:40:38.079777 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc" Apr 24 16:40:38.080275 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:40:38.079928 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:40:38.080275 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:40:38.079945 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f96c945bd-7bmhc: secret "image-registry-tls" not found Apr 24 16:40:38.080275 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:40:38.080023 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls podName:2af4c218-5178-4a23-b52e-7e4972f6785c nodeName:}" failed. No retries permitted until 2026-04-24 16:41:42.080007176 +0000 UTC m=+160.750638055 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls") pod "image-registry-7f96c945bd-7bmhc" (UID: "2af4c218-5178-4a23-b52e-7e4972f6785c") : secret "image-registry-tls" not found Apr 24 16:40:38.180719 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:40:38.180693 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert\") pod \"ingress-canary-rcmmv\" (UID: \"5f5bde11-c32a-402f-994e-143e03a8dd70\") " pod="openshift-ingress-canary/ingress-canary-rcmmv" Apr 24 16:40:38.180845 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:40:38.180748 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2" Apr 24 16:40:38.180925 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:40:38.180842 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:40:38.180925 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:40:38.180899 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert podName:5f5bde11-c32a-402f-994e-143e03a8dd70 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:42.18088485 +0000 UTC m=+160.851515734 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert") pod "ingress-canary-rcmmv" (UID: "5f5bde11-c32a-402f-994e-143e03a8dd70") : secret "canary-serving-cert" not found Apr 24 16:40:38.181007 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:40:38.180842 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:40:38.181007 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:40:38.180967 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls podName:086b1d61-526e-4540-a577-70f6d4cc1109 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:42.18095524 +0000 UTC m=+160.851586119 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls") pod "dns-default-vwpz2" (UID: "086b1d61-526e-4540-a577-70f6d4cc1109") : secret "dns-default-metrics-tls" not found Apr 24 16:40:42.151260 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:40:42.151228 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-spk8g" Apr 24 16:41:11.614735 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:11.614684 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs\") pod \"network-metrics-daemon-bzld5\" (UID: \"b98926f9-5237-4269-877b-422b4e1c6edf\") " pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:41:11.615258 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:41:11.614836 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 16:41:11.615258 ip-10-0-137-69 kubenswrapper[2563]: E0424 
16:41:11.614909 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs podName:b98926f9-5237-4269-877b-422b4e1c6edf nodeName:}" failed. No retries permitted until 2026-04-24 16:43:13.614893761 +0000 UTC m=+252.285524640 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs") pod "network-metrics-daemon-bzld5" (UID: "b98926f9-5237-4269-877b-422b4e1c6edf") : secret "metrics-daemon-secret" not found Apr 24 16:41:37.240969 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:41:37.240932 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc" podUID="2af4c218-5178-4a23-b52e-7e4972f6785c" Apr 24 16:41:37.254066 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:41:37.254041 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-vwpz2" podUID="086b1d61-526e-4540-a577-70f6d4cc1109" Apr 24 16:41:37.271341 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:41:37.271312 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-rcmmv" podUID="5f5bde11-c32a-402f-994e-143e03a8dd70" Apr 24 16:41:37.341314 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:37.341290 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vwpz2" Apr 24 16:41:38.244549 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:38.244522 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5dlzs_b514d98b-1969-443b-b7cf-8c931162148a/dns-node-resolver/0.log" Apr 24 16:41:38.872090 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:41:38.872053 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-bzld5" podUID="b98926f9-5237-4269-877b-422b4e1c6edf" Apr 24 16:41:39.448525 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:39.448500 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j59hh_c532fe1e-4595-4088-b40a-8ee0058e4ccd/node-ca/0.log" Apr 24 16:41:42.130298 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:42.130270 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls\") pod \"image-registry-7f96c945bd-7bmhc\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") " pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc" Apr 24 16:41:42.130721 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:41:42.130379 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:41:42.130721 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:41:42.130390 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f96c945bd-7bmhc: secret "image-registry-tls" not found Apr 24 16:41:42.130721 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:41:42.130432 2563 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls podName:2af4c218-5178-4a23-b52e-7e4972f6785c nodeName:}" failed. No retries permitted until 2026-04-24 16:43:44.13042028 +0000 UTC m=+282.801051159 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls") pod "image-registry-7f96c945bd-7bmhc" (UID: "2af4c218-5178-4a23-b52e-7e4972f6785c") : secret "image-registry-tls" not found Apr 24 16:41:42.231510 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:42.231472 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2" Apr 24 16:41:42.231685 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:42.231543 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert\") pod \"ingress-canary-rcmmv\" (UID: \"5f5bde11-c32a-402f-994e-143e03a8dd70\") " pod="openshift-ingress-canary/ingress-canary-rcmmv" Apr 24 16:41:42.231685 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:41:42.231629 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:41:42.231798 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:41:42.231698 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls podName:086b1d61-526e-4540-a577-70f6d4cc1109 nodeName:}" failed. No retries permitted until 2026-04-24 16:43:44.231678196 +0000 UTC m=+282.902309083 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls") pod "dns-default-vwpz2" (UID: "086b1d61-526e-4540-a577-70f6d4cc1109") : secret "dns-default-metrics-tls" not found Apr 24 16:41:42.231798 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:41:42.231637 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:41:42.231798 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:41:42.231765 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert podName:5f5bde11-c32a-402f-994e-143e03a8dd70 nodeName:}" failed. No retries permitted until 2026-04-24 16:43:44.231753951 +0000 UTC m=+282.902384830 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert") pod "ingress-canary-rcmmv" (UID: "5f5bde11-c32a-402f-994e-143e03a8dd70") : secret "canary-serving-cert" not found Apr 24 16:41:49.854438 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:49.854397 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc" Apr 24 16:41:50.854719 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:50.854627 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:41:52.855168 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:52.855122 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rcmmv" Apr 24 16:41:53.100786 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:53.100726 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" podUID="eee9b715-088d-45ee-b475-6b3eeb422603" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.10:8000/readyz\": dial tcp 10.132.0.10:8000: connect: connection refused" Apr 24 16:41:53.378785 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:53.378704 2563 generic.go:358] "Generic (PLEG): container finished" podID="eee9b715-088d-45ee-b475-6b3eeb422603" containerID="9b248064052b46c6d9ba64175a25939500c6141e45a91200500133172dfaab66" exitCode=1 Apr 24 16:41:53.378937 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:53.378783 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" event={"ID":"eee9b715-088d-45ee-b475-6b3eeb422603","Type":"ContainerDied","Data":"9b248064052b46c6d9ba64175a25939500c6141e45a91200500133172dfaab66"} Apr 24 16:41:53.379129 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:53.379112 2563 scope.go:117] "RemoveContainer" containerID="9b248064052b46c6d9ba64175a25939500c6141e45a91200500133172dfaab66" Apr 24 16:41:53.380054 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:53.380030 2563 generic.go:358] "Generic (PLEG): container finished" podID="dd3c5a62-57fa-4943-8b21-975dd46640ad" containerID="737cd5fdd29bfcada18a15ae836f6c18fb7e5c5f9e772edbad9ba7e44c31856b" exitCode=255 Apr 24 16:41:53.380173 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:53.380070 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9" event={"ID":"dd3c5a62-57fa-4943-8b21-975dd46640ad","Type":"ContainerDied","Data":"737cd5fdd29bfcada18a15ae836f6c18fb7e5c5f9e772edbad9ba7e44c31856b"} Apr 24 
16:41:53.380406 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:53.380332 2563 scope.go:117] "RemoveContainer" containerID="737cd5fdd29bfcada18a15ae836f6c18fb7e5c5f9e772edbad9ba7e44c31856b" Apr 24 16:41:54.383504 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:54.383471 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6cf99ff4c-ddsv9" event={"ID":"dd3c5a62-57fa-4943-8b21-975dd46640ad","Type":"ContainerStarted","Data":"c4bb77224c38d30ac650700683f76862a4873f99e2cb94f3006859e4720765b9"} Apr 24 16:41:54.384936 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:54.384918 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" event={"ID":"eee9b715-088d-45ee-b475-6b3eeb422603","Type":"ContainerStarted","Data":"54db4eb6b1e32784e3bb8e146b88ac84a37a95e5e1c7d20c2872cbf6e29d4d50"} Apr 24 16:41:54.385172 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:54.385157 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" Apr 24 16:41:54.385652 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:41:54.385639 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f7554cd7f-pbvs8" Apr 24 16:42:01.431778 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.431745 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fkrh6"] Apr 24 16:42:01.434856 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.434841 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fkrh6" Apr 24 16:42:01.437230 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.437205 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 16:42:01.438005 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.437981 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jgncl\"" Apr 24 16:42:01.438307 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.438285 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 16:42:01.438412 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.438314 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 16:42:01.438412 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.438313 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 16:42:01.446290 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.446269 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fkrh6"] Apr 24 16:42:01.577056 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.577028 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f99073ce-374c-4453-a1aa-6ee180e85c92-data-volume\") pod \"insights-runtime-extractor-fkrh6\" (UID: \"f99073ce-374c-4453-a1aa-6ee180e85c92\") " pod="openshift-insights/insights-runtime-extractor-fkrh6" Apr 24 16:42:01.577187 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.577082 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" 
(UniqueName: \"kubernetes.io/configmap/f99073ce-374c-4453-a1aa-6ee180e85c92-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fkrh6\" (UID: \"f99073ce-374c-4453-a1aa-6ee180e85c92\") " pod="openshift-insights/insights-runtime-extractor-fkrh6" Apr 24 16:42:01.577187 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.577102 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f99073ce-374c-4453-a1aa-6ee180e85c92-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fkrh6\" (UID: \"f99073ce-374c-4453-a1aa-6ee180e85c92\") " pod="openshift-insights/insights-runtime-extractor-fkrh6" Apr 24 16:42:01.577265 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.577203 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qpcv\" (UniqueName: \"kubernetes.io/projected/f99073ce-374c-4453-a1aa-6ee180e85c92-kube-api-access-8qpcv\") pod \"insights-runtime-extractor-fkrh6\" (UID: \"f99073ce-374c-4453-a1aa-6ee180e85c92\") " pod="openshift-insights/insights-runtime-extractor-fkrh6" Apr 24 16:42:01.577265 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.577239 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f99073ce-374c-4453-a1aa-6ee180e85c92-crio-socket\") pod \"insights-runtime-extractor-fkrh6\" (UID: \"f99073ce-374c-4453-a1aa-6ee180e85c92\") " pod="openshift-insights/insights-runtime-extractor-fkrh6" Apr 24 16:42:01.678489 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.678461 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qpcv\" (UniqueName: \"kubernetes.io/projected/f99073ce-374c-4453-a1aa-6ee180e85c92-kube-api-access-8qpcv\") pod \"insights-runtime-extractor-fkrh6\" (UID: \"f99073ce-374c-4453-a1aa-6ee180e85c92\") " 
pod="openshift-insights/insights-runtime-extractor-fkrh6" Apr 24 16:42:01.678623 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.678502 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f99073ce-374c-4453-a1aa-6ee180e85c92-crio-socket\") pod \"insights-runtime-extractor-fkrh6\" (UID: \"f99073ce-374c-4453-a1aa-6ee180e85c92\") " pod="openshift-insights/insights-runtime-extractor-fkrh6" Apr 24 16:42:01.678623 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.678529 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f99073ce-374c-4453-a1aa-6ee180e85c92-data-volume\") pod \"insights-runtime-extractor-fkrh6\" (UID: \"f99073ce-374c-4453-a1aa-6ee180e85c92\") " pod="openshift-insights/insights-runtime-extractor-fkrh6" Apr 24 16:42:01.678623 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.678605 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f99073ce-374c-4453-a1aa-6ee180e85c92-crio-socket\") pod \"insights-runtime-extractor-fkrh6\" (UID: \"f99073ce-374c-4453-a1aa-6ee180e85c92\") " pod="openshift-insights/insights-runtime-extractor-fkrh6" Apr 24 16:42:01.678790 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.678673 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f99073ce-374c-4453-a1aa-6ee180e85c92-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fkrh6\" (UID: \"f99073ce-374c-4453-a1aa-6ee180e85c92\") " pod="openshift-insights/insights-runtime-extractor-fkrh6" Apr 24 16:42:01.678790 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.678712 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/f99073ce-374c-4453-a1aa-6ee180e85c92-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fkrh6\" (UID: \"f99073ce-374c-4453-a1aa-6ee180e85c92\") " pod="openshift-insights/insights-runtime-extractor-fkrh6" Apr 24 16:42:01.678863 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.678792 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f99073ce-374c-4453-a1aa-6ee180e85c92-data-volume\") pod \"insights-runtime-extractor-fkrh6\" (UID: \"f99073ce-374c-4453-a1aa-6ee180e85c92\") " pod="openshift-insights/insights-runtime-extractor-fkrh6" Apr 24 16:42:01.679126 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.679104 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f99073ce-374c-4453-a1aa-6ee180e85c92-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fkrh6\" (UID: \"f99073ce-374c-4453-a1aa-6ee180e85c92\") " pod="openshift-insights/insights-runtime-extractor-fkrh6" Apr 24 16:42:01.680886 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.680869 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f99073ce-374c-4453-a1aa-6ee180e85c92-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fkrh6\" (UID: \"f99073ce-374c-4453-a1aa-6ee180e85c92\") " pod="openshift-insights/insights-runtime-extractor-fkrh6" Apr 24 16:42:01.686261 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.686216 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qpcv\" (UniqueName: \"kubernetes.io/projected/f99073ce-374c-4453-a1aa-6ee180e85c92-kube-api-access-8qpcv\") pod \"insights-runtime-extractor-fkrh6\" (UID: \"f99073ce-374c-4453-a1aa-6ee180e85c92\") " pod="openshift-insights/insights-runtime-extractor-fkrh6" Apr 24 16:42:01.745101 ip-10-0-137-69 
kubenswrapper[2563]: I0424 16:42:01.745075 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jgncl\"" Apr 24 16:42:01.754128 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.754107 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fkrh6" Apr 24 16:42:01.870236 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:01.870208 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fkrh6"] Apr 24 16:42:01.873461 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:42:01.873435 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf99073ce_374c_4453_a1aa_6ee180e85c92.slice/crio-c6c189f9fa8e5eaf1dd742f7c50ff8aa75e57b6b011bc8edd0948f8e72c0ce85 WatchSource:0}: Error finding container c6c189f9fa8e5eaf1dd742f7c50ff8aa75e57b6b011bc8edd0948f8e72c0ce85: Status 404 returned error can't find the container with id c6c189f9fa8e5eaf1dd742f7c50ff8aa75e57b6b011bc8edd0948f8e72c0ce85 Apr 24 16:42:02.407826 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:02.407792 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fkrh6" event={"ID":"f99073ce-374c-4453-a1aa-6ee180e85c92","Type":"ContainerStarted","Data":"5105e8b773cb19b01e1197984a1607e46d3a4a4de34cdfdfa0d26acc69caed79"} Apr 24 16:42:02.407826 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:02.407830 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fkrh6" event={"ID":"f99073ce-374c-4453-a1aa-6ee180e85c92","Type":"ContainerStarted","Data":"c6c189f9fa8e5eaf1dd742f7c50ff8aa75e57b6b011bc8edd0948f8e72c0ce85"} Apr 24 16:42:03.412570 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:03.412540 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-fkrh6" event={"ID":"f99073ce-374c-4453-a1aa-6ee180e85c92","Type":"ContainerStarted","Data":"b30a37a549a727859f0caf19d08a6e92224c83a47bba2927384427bde21f932e"} Apr 24 16:42:04.418229 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:04.418189 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fkrh6" event={"ID":"f99073ce-374c-4453-a1aa-6ee180e85c92","Type":"ContainerStarted","Data":"1fe4bea0b26eeb8774701323b41ccc1dbadb251f5a372e403aee45d5d3d66e15"} Apr 24 16:42:04.437335 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:04.437292 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fkrh6" podStartSLOduration=1.4817734200000001 podStartE2EDuration="3.437278465s" podCreationTimestamp="2026-04-24 16:42:01 +0000 UTC" firstStartedPulling="2026-04-24 16:42:01.925074168 +0000 UTC m=+180.595705047" lastFinishedPulling="2026-04-24 16:42:03.880579201 +0000 UTC m=+182.551210092" observedRunningTime="2026-04-24 16:42:04.436415105 +0000 UTC m=+183.107046006" watchObservedRunningTime="2026-04-24 16:42:04.437278465 +0000 UTC m=+183.107909365" Apr 24 16:42:13.404774 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.404744 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-7v47c"] Apr 24 16:42:13.409252 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.409224 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-7v47c" Apr 24 16:42:13.411504 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.411484 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 16:42:13.411906 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.411887 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 16:42:13.412001 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.411934 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 16:42:13.412415 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.412396 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-4464q\"" Apr 24 16:42:13.412522 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.412401 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 16:42:13.412644 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.412630 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 16:42:13.412706 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.412692 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 16:42:13.563871 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.563837 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7c06394a-d2a9-474f-8448-b491ea74f9df-node-exporter-accelerators-collector-config\") pod \"node-exporter-7v47c\" (UID: 
\"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.564017 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.563906 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7c06394a-d2a9-474f-8448-b491ea74f9df-node-exporter-wtmp\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.564017 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.563938 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsk95\" (UniqueName: \"kubernetes.io/projected/7c06394a-d2a9-474f-8448-b491ea74f9df-kube-api-access-tsk95\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.564017 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.563978 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c06394a-d2a9-474f-8448-b491ea74f9df-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.564226 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.564019 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7c06394a-d2a9-474f-8448-b491ea74f9df-root\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.564226 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.564046 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7c06394a-d2a9-474f-8448-b491ea74f9df-node-exporter-tls\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.564226 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.564062 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c06394a-d2a9-474f-8448-b491ea74f9df-metrics-client-ca\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.564226 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.564187 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c06394a-d2a9-474f-8448-b491ea74f9df-sys\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.564373 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.564237 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7c06394a-d2a9-474f-8448-b491ea74f9df-node-exporter-textfile\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.665603 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.665532 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7c06394a-d2a9-474f-8448-b491ea74f9df-node-exporter-wtmp\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.665603 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.665568 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsk95\" (UniqueName: \"kubernetes.io/projected/7c06394a-d2a9-474f-8448-b491ea74f9df-kube-api-access-tsk95\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.665814 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.665605 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c06394a-d2a9-474f-8448-b491ea74f9df-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.665814 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.665634 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7c06394a-d2a9-474f-8448-b491ea74f9df-root\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.665814 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.665652 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7c06394a-d2a9-474f-8448-b491ea74f9df-node-exporter-tls\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.665814 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.665688 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c06394a-d2a9-474f-8448-b491ea74f9df-metrics-client-ca\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.665814 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.665699 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7c06394a-d2a9-474f-8448-b491ea74f9df-node-exporter-wtmp\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.665814 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.665746 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c06394a-d2a9-474f-8448-b491ea74f9df-sys\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.665814 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.665790 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7c06394a-d2a9-474f-8448-b491ea74f9df-node-exporter-textfile\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.666091 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.665822 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7c06394a-d2a9-474f-8448-b491ea74f9df-node-exporter-accelerators-collector-config\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.666091 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.665829 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c06394a-d2a9-474f-8448-b491ea74f9df-sys\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.666091 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.665746 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7c06394a-d2a9-474f-8448-b491ea74f9df-root\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.666271 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.666190 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7c06394a-d2a9-474f-8448-b491ea74f9df-node-exporter-textfile\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.666403 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.666378 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c06394a-d2a9-474f-8448-b491ea74f9df-metrics-client-ca\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.666533 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.666479 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7c06394a-d2a9-474f-8448-b491ea74f9df-node-exporter-accelerators-collector-config\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.668047 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.668022 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7c06394a-d2a9-474f-8448-b491ea74f9df-node-exporter-tls\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.668173 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.668089 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c06394a-d2a9-474f-8448-b491ea74f9df-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.680275 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.680255 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsk95\" (UniqueName: \"kubernetes.io/projected/7c06394a-d2a9-474f-8448-b491ea74f9df-kube-api-access-tsk95\") pod \"node-exporter-7v47c\" (UID: \"7c06394a-d2a9-474f-8448-b491ea74f9df\") " pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.717468 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:13.717447 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7v47c"
Apr 24 16:42:13.725387 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:42:13.725350 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c06394a_d2a9_474f_8448_b491ea74f9df.slice/crio-446c9305dc06168592551b82470e1e6c1a4133cc12e956dee81bd89ddbbaab68 WatchSource:0}: Error finding container 446c9305dc06168592551b82470e1e6c1a4133cc12e956dee81bd89ddbbaab68: Status 404 returned error can't find the container with id 446c9305dc06168592551b82470e1e6c1a4133cc12e956dee81bd89ddbbaab68
Apr 24 16:42:14.442702 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:14.442674 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7v47c" event={"ID":"7c06394a-d2a9-474f-8448-b491ea74f9df","Type":"ContainerStarted","Data":"446c9305dc06168592551b82470e1e6c1a4133cc12e956dee81bd89ddbbaab68"}
Apr 24 16:42:15.446048 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:15.446013 2563 generic.go:358] "Generic (PLEG): container finished" podID="7c06394a-d2a9-474f-8448-b491ea74f9df" containerID="c4b81f54d1800fbd5c75557d2eb86b623bb7d2e95acd5fdcc97bbb25d9ad6288" exitCode=0
Apr 24 16:42:15.446439 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:15.446071 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7v47c" event={"ID":"7c06394a-d2a9-474f-8448-b491ea74f9df","Type":"ContainerDied","Data":"c4b81f54d1800fbd5c75557d2eb86b623bb7d2e95acd5fdcc97bbb25d9ad6288"}
Apr 24 16:42:16.450608 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:16.450573 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7v47c" event={"ID":"7c06394a-d2a9-474f-8448-b491ea74f9df","Type":"ContainerStarted","Data":"e45f5567f5169ba44b5b78ed525d82d12c450bd7f35fcdbbffb305f9636b772a"}
Apr 24 16:42:16.450608 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:16.450608 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7v47c" event={"ID":"7c06394a-d2a9-474f-8448-b491ea74f9df","Type":"ContainerStarted","Data":"fa3f6c04c4bfbdc40137808f16880a1b68632a577389219ab40e7658cd846288"}
Apr 24 16:42:16.471387 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:16.471333 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7v47c" podStartSLOduration=2.762403782 podStartE2EDuration="3.47132113s" podCreationTimestamp="2026-04-24 16:42:13 +0000 UTC" firstStartedPulling="2026-04-24 16:42:13.727084336 +0000 UTC m=+192.397715215" lastFinishedPulling="2026-04-24 16:42:14.436001667 +0000 UTC m=+193.106632563" observedRunningTime="2026-04-24 16:42:16.469620989 +0000 UTC m=+195.140251890" watchObservedRunningTime="2026-04-24 16:42:16.47132113 +0000 UTC m=+195.141952031"
Apr 24 16:42:23.367831 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.367795 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7f96c945bd-7bmhc"]
Apr 24 16:42:23.368206 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:42:23.367964 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc" podUID="2af4c218-5178-4a23-b52e-7e4972f6785c"
Apr 24 16:42:23.467834 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.467806 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:42:23.471795 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.471775 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:42:23.645780 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.645710 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-bound-sa-token\") pod \"2af4c218-5178-4a23-b52e-7e4972f6785c\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") "
Apr 24 16:42:23.645780 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.645752 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2af4c218-5178-4a23-b52e-7e4972f6785c-installation-pull-secrets\") pod \"2af4c218-5178-4a23-b52e-7e4972f6785c\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") "
Apr 24 16:42:23.645978 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.645791 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2af4c218-5178-4a23-b52e-7e4972f6785c-trusted-ca\") pod \"2af4c218-5178-4a23-b52e-7e4972f6785c\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") "
Apr 24 16:42:23.645978 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.645808 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2af4c218-5178-4a23-b52e-7e4972f6785c-ca-trust-extracted\") pod \"2af4c218-5178-4a23-b52e-7e4972f6785c\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") "
Apr 24 16:42:23.645978 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.645824 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czt5x\" (UniqueName: \"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-kube-api-access-czt5x\") pod \"2af4c218-5178-4a23-b52e-7e4972f6785c\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") "
Apr 24 16:42:23.645978 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.645850 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2af4c218-5178-4a23-b52e-7e4972f6785c-image-registry-private-configuration\") pod \"2af4c218-5178-4a23-b52e-7e4972f6785c\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") "
Apr 24 16:42:23.645978 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.645873 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-certificates\") pod \"2af4c218-5178-4a23-b52e-7e4972f6785c\" (UID: \"2af4c218-5178-4a23-b52e-7e4972f6785c\") "
Apr 24 16:42:23.646310 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.646260 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2af4c218-5178-4a23-b52e-7e4972f6785c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2af4c218-5178-4a23-b52e-7e4972f6785c" (UID: "2af4c218-5178-4a23-b52e-7e4972f6785c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 16:42:23.646404 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.646379 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af4c218-5178-4a23-b52e-7e4972f6785c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2af4c218-5178-4a23-b52e-7e4972f6785c" (UID: "2af4c218-5178-4a23-b52e-7e4972f6785c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:42:23.646544 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.646515 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2af4c218-5178-4a23-b52e-7e4972f6785c" (UID: "2af4c218-5178-4a23-b52e-7e4972f6785c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:42:23.648174 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.648127 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af4c218-5178-4a23-b52e-7e4972f6785c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2af4c218-5178-4a23-b52e-7e4972f6785c" (UID: "2af4c218-5178-4a23-b52e-7e4972f6785c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:42:23.648277 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.648162 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-kube-api-access-czt5x" (OuterVolumeSpecName: "kube-api-access-czt5x") pod "2af4c218-5178-4a23-b52e-7e4972f6785c" (UID: "2af4c218-5178-4a23-b52e-7e4972f6785c"). InnerVolumeSpecName "kube-api-access-czt5x". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:42:23.648344 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.648289 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2af4c218-5178-4a23-b52e-7e4972f6785c" (UID: "2af4c218-5178-4a23-b52e-7e4972f6785c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:42:23.648403 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.648384 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af4c218-5178-4a23-b52e-7e4972f6785c-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "2af4c218-5178-4a23-b52e-7e4972f6785c" (UID: "2af4c218-5178-4a23-b52e-7e4972f6785c"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:42:23.747163 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.747122 2563 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2af4c218-5178-4a23-b52e-7e4972f6785c-trusted-ca\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\""
Apr 24 16:42:23.747163 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.747160 2563 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2af4c218-5178-4a23-b52e-7e4972f6785c-ca-trust-extracted\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\""
Apr 24 16:42:23.747336 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.747171 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-czt5x\" (UniqueName: \"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-kube-api-access-czt5x\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\""
Apr 24 16:42:23.747336 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.747181 2563 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2af4c218-5178-4a23-b52e-7e4972f6785c-image-registry-private-configuration\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\""
Apr 24 16:42:23.747336 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.747195 2563 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-certificates\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\""
Apr 24 16:42:23.747336 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.747208 2563 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-bound-sa-token\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\""
Apr 24 16:42:23.747336 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:23.747219 2563 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2af4c218-5178-4a23-b52e-7e4972f6785c-installation-pull-secrets\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\""
Apr 24 16:42:24.470524 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:24.470495 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7f96c945bd-7bmhc"
Apr 24 16:42:24.504836 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:24.504808 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7f96c945bd-7bmhc"]
Apr 24 16:42:24.508849 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:24.508825 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7f96c945bd-7bmhc"]
Apr 24 16:42:24.653122 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:24.653086 2563 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2af4c218-5178-4a23-b52e-7e4972f6785c-registry-tls\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\""
Apr 24 16:42:25.858198 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:25.858166 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af4c218-5178-4a23-b52e-7e4972f6785c" path="/var/lib/kubelet/pods/2af4c218-5178-4a23-b52e-7e4972f6785c/volumes"
Apr 24 16:42:41.293811 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.293778 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c54bd79dd-mzrps"]
Apr 24 16:42:41.296434 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.296415 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.299079 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.299058 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 24 16:42:41.299221 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.299203 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-ctfdb\""
Apr 24 16:42:41.299292 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.299279 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 16:42:41.300093 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.300068 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 16:42:41.300093 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.300087 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 16:42:41.300272 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.300101 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 16:42:41.300272 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.300092 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 16:42:41.300272 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.300068 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 24 16:42:41.305508 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.305490 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 24 16:42:41.309015 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.308996 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c54bd79dd-mzrps"]
Apr 24 16:42:41.360375 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.360348 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-console-config\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.360491 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.360383 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-trusted-ca-bundle\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.360491 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.360400 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/89bc5cdc-1e79-447c-ac9d-917132630c70-console-oauth-config\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.360491 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.360445 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-oauth-serving-cert\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.360491 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.360478 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-service-ca\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.360637 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.360562 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-756m9\" (UniqueName: \"kubernetes.io/projected/89bc5cdc-1e79-447c-ac9d-917132630c70-kube-api-access-756m9\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.360637 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.360596 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/89bc5cdc-1e79-447c-ac9d-917132630c70-console-serving-cert\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.461783 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.461738 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-756m9\" (UniqueName: \"kubernetes.io/projected/89bc5cdc-1e79-447c-ac9d-917132630c70-kube-api-access-756m9\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.461783 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.461789 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/89bc5cdc-1e79-447c-ac9d-917132630c70-console-serving-cert\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.462022 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.461809 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-console-config\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.462022 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.461834 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-trusted-ca-bundle\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.462022 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.461849 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/89bc5cdc-1e79-447c-ac9d-917132630c70-console-oauth-config\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.462022 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.461881 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-oauth-serving-cert\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.462022 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.461908 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-service-ca\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.462551 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.462525 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-console-config\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.462654 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.462638 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-service-ca\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.462703 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.462649 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-oauth-serving-cert\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.462980 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.462955 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-trusted-ca-bundle\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.464342 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.464322 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/89bc5cdc-1e79-447c-ac9d-917132630c70-console-serving-cert\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.464421 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.464351 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/89bc5cdc-1e79-447c-ac9d-917132630c70-console-oauth-config\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.471634 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.471612 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-756m9\" (UniqueName: \"kubernetes.io/projected/89bc5cdc-1e79-447c-ac9d-917132630c70-kube-api-access-756m9\") pod \"console-6c54bd79dd-mzrps\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.605582 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.605522 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:41.718567 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:41.718539 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c54bd79dd-mzrps"]
Apr 24 16:42:41.721267 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:42:41.721242 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89bc5cdc_1e79_447c_ac9d_917132630c70.slice/crio-2a751826ba4b23fc7f659d38da046b2539e908a329d6a5a14b1c30d9dde5b8be WatchSource:0}: Error finding container 2a751826ba4b23fc7f659d38da046b2539e908a329d6a5a14b1c30d9dde5b8be: Status 404 returned error can't find the container with id 2a751826ba4b23fc7f659d38da046b2539e908a329d6a5a14b1c30d9dde5b8be
Apr 24 16:42:42.524679 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:42.524640 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c54bd79dd-mzrps" event={"ID":"89bc5cdc-1e79-447c-ac9d-917132630c70","Type":"ContainerStarted","Data":"2a751826ba4b23fc7f659d38da046b2539e908a329d6a5a14b1c30d9dde5b8be"}
Apr 24 16:42:44.531164 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:44.531113 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c54bd79dd-mzrps" event={"ID":"89bc5cdc-1e79-447c-ac9d-917132630c70","Type":"ContainerStarted","Data":"69942fe9b7c247af479fc33ab679c9eeacfc2afd1d7a9a16b586b7804c9beb17"}
Apr 24 16:42:44.549845 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:44.549799 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c54bd79dd-mzrps" podStartSLOduration=0.881549378 podStartE2EDuration="3.549786222s" podCreationTimestamp="2026-04-24 16:42:41 +0000 UTC" firstStartedPulling="2026-04-24 16:42:41.72305407 +0000 UTC m=+220.393684949" lastFinishedPulling="2026-04-24 16:42:44.391290899 +0000 UTC m=+223.061921793" observedRunningTime="2026-04-24 16:42:44.548407756 +0000 UTC m=+223.219038656" watchObservedRunningTime="2026-04-24 16:42:44.549786222 +0000 UTC m=+223.220417123"
Apr 24 16:42:46.679494 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:46.679455 2563 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" podUID="13f93392-072f-43d6-839d-2c258de6c94b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 16:42:51.606075 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:51.606045 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:51.606463 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:51.606124 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:51.610505 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:51.610485 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:52.554869 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:52.554843 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c54bd79dd-mzrps"
Apr 24 16:42:56.679384 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:42:56.679347 2563 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" podUID="13f93392-072f-43d6-839d-2c258de6c94b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 16:43:06.679434 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:06.679394 2563 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" podUID="13f93392-072f-43d6-839d-2c258de6c94b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 16:43:06.679905 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:06.679483 2563 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q"
Apr 24 16:43:06.680093 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:06.680058 2563 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"bd8f771979f23184f370b0a421f6c230c59c17a2625f6b302bdd0c7929e5d3a9"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 24 16:43:06.680185 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:06.680149 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" podUID="13f93392-072f-43d6-839d-2c258de6c94b" containerName="service-proxy" containerID="cri-o://bd8f771979f23184f370b0a421f6c230c59c17a2625f6b302bdd0c7929e5d3a9" gracePeriod=30
Apr 24 16:43:07.586527 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:07.586493 2563 generic.go:358] "Generic (PLEG): container finished" podID="13f93392-072f-43d6-839d-2c258de6c94b" containerID="bd8f771979f23184f370b0a421f6c230c59c17a2625f6b302bdd0c7929e5d3a9" exitCode=2
Apr 24 16:43:07.586695 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:07.586537 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" event={"ID":"13f93392-072f-43d6-839d-2c258de6c94b","Type":"ContainerDied","Data":"bd8f771979f23184f370b0a421f6c230c59c17a2625f6b302bdd0c7929e5d3a9"}
Apr 24 16:43:07.586695 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:07.586569 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-864b646dc6-9ph5q" event={"ID":"13f93392-072f-43d6-839d-2c258de6c94b","Type":"ContainerStarted","Data":"d5df748a2794662200503bbf1a6a7ec54bbbd4dc1d45fef3f4d90107d26879a2"} Apr 24 16:43:10.707927 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:10.707900 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7v47c_7c06394a-d2a9-474f-8448-b491ea74f9df/init-textfile/0.log" Apr 24 16:43:10.896400 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:10.896372 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7v47c_7c06394a-d2a9-474f-8448-b491ea74f9df/node-exporter/0.log" Apr 24 16:43:11.095452 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:11.095376 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7v47c_7c06394a-d2a9-474f-8448-b491ea74f9df/kube-rbac-proxy/0.log" Apr 24 16:43:13.698735 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:13.698695 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs\") pod \"network-metrics-daemon-bzld5\" (UID: \"b98926f9-5237-4269-877b-422b4e1c6edf\") " pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:43:13.700926 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:13.700906 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b98926f9-5237-4269-877b-422b4e1c6edf-metrics-certs\") pod \"network-metrics-daemon-bzld5\" (UID: \"b98926f9-5237-4269-877b-422b4e1c6edf\") " pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:43:13.958471 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:13.958394 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-czswq\"" 
Apr 24 16:43:13.966337 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:13.966320 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bzld5" Apr 24 16:43:14.088986 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:14.088958 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bzld5"] Apr 24 16:43:14.092231 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:43:14.092204 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb98926f9_5237_4269_877b_422b4e1c6edf.slice/crio-067c9dc67272ccddddb283fbfb7328c0c2e3d874b0749e099a352d1f25e62dd9 WatchSource:0}: Error finding container 067c9dc67272ccddddb283fbfb7328c0c2e3d874b0749e099a352d1f25e62dd9: Status 404 returned error can't find the container with id 067c9dc67272ccddddb283fbfb7328c0c2e3d874b0749e099a352d1f25e62dd9 Apr 24 16:43:14.604488 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:14.604453 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bzld5" event={"ID":"b98926f9-5237-4269-877b-422b4e1c6edf","Type":"ContainerStarted","Data":"067c9dc67272ccddddb283fbfb7328c0c2e3d874b0749e099a352d1f25e62dd9"} Apr 24 16:43:15.608424 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:15.608388 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bzld5" event={"ID":"b98926f9-5237-4269-877b-422b4e1c6edf","Type":"ContainerStarted","Data":"bc931558491fc9cc7c9c2bce9b1e65e3a7c705a37b534378c3d0ae88b77ce362"} Apr 24 16:43:15.608424 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:15.608424 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bzld5" event={"ID":"b98926f9-5237-4269-877b-422b4e1c6edf","Type":"ContainerStarted","Data":"bf94e290cde3740eb7be2149bbbb74983743cb2861161cb09f8e315eb313ef0b"} Apr 24 16:43:15.623963 
ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:15.623771 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bzld5" podStartSLOduration=252.346187535 podStartE2EDuration="4m13.623754088s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="2026-04-24 16:43:14.093925686 +0000 UTC m=+252.764556564" lastFinishedPulling="2026-04-24 16:43:15.371492234 +0000 UTC m=+254.042123117" observedRunningTime="2026-04-24 16:43:15.623288535 +0000 UTC m=+254.293919437" watchObservedRunningTime="2026-04-24 16:43:15.623754088 +0000 UTC m=+254.294384989" Apr 24 16:43:15.701685 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:15.701656 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c54bd79dd-mzrps_89bc5cdc-1e79-447c-ac9d-917132630c70/console/0.log" Apr 24 16:43:40.341843 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:43:40.341792 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-vwpz2" podUID="086b1d61-526e-4540-a577-70f6d4cc1109" Apr 24 16:43:40.678509 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:40.678421 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vwpz2" Apr 24 16:43:44.312612 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:44.312559 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2" Apr 24 16:43:44.312612 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:44.312617 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert\") pod \"ingress-canary-rcmmv\" (UID: \"5f5bde11-c32a-402f-994e-143e03a8dd70\") " pod="openshift-ingress-canary/ingress-canary-rcmmv" Apr 24 16:43:44.314900 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:44.314877 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f5bde11-c32a-402f-994e-143e03a8dd70-cert\") pod \"ingress-canary-rcmmv\" (UID: \"5f5bde11-c32a-402f-994e-143e03a8dd70\") " pod="openshift-ingress-canary/ingress-canary-rcmmv" Apr 24 16:43:44.315415 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:44.315386 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/086b1d61-526e-4540-a577-70f6d4cc1109-metrics-tls\") pod \"dns-default-vwpz2\" (UID: \"086b1d61-526e-4540-a577-70f6d4cc1109\") " pod="openshift-dns/dns-default-vwpz2" Apr 24 16:43:44.458795 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:44.458768 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2lclw\"" Apr 24 16:43:44.466859 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:44.466841 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rcmmv" Apr 24 16:43:44.585293 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:44.585233 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ljxll\"" Apr 24 16:43:44.589513 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:44.589497 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vwpz2" Apr 24 16:43:44.604180 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:44.604157 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rcmmv"] Apr 24 16:43:44.608611 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:43:44.608588 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f5bde11_c32a_402f_994e_143e03a8dd70.slice/crio-19735eec19185178a7bd44ab2ddf19f97aec05414a49f0706728642fc79ece31 WatchSource:0}: Error finding container 19735eec19185178a7bd44ab2ddf19f97aec05414a49f0706728642fc79ece31: Status 404 returned error can't find the container with id 19735eec19185178a7bd44ab2ddf19f97aec05414a49f0706728642fc79ece31 Apr 24 16:43:44.689408 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:44.689375 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rcmmv" event={"ID":"5f5bde11-c32a-402f-994e-143e03a8dd70","Type":"ContainerStarted","Data":"19735eec19185178a7bd44ab2ddf19f97aec05414a49f0706728642fc79ece31"} Apr 24 16:43:44.721650 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:44.721624 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vwpz2"] Apr 24 16:43:44.725603 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:43:44.725582 2563 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod086b1d61_526e_4540_a577_70f6d4cc1109.slice/crio-a9914d56c715d7d484412d0e82ee8851dbd6fdf5f1e0053cdfa7a8f5b7f5b3e3 WatchSource:0}: Error finding container a9914d56c715d7d484412d0e82ee8851dbd6fdf5f1e0053cdfa7a8f5b7f5b3e3: Status 404 returned error can't find the container with id a9914d56c715d7d484412d0e82ee8851dbd6fdf5f1e0053cdfa7a8f5b7f5b3e3 Apr 24 16:43:45.692988 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:45.692948 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vwpz2" event={"ID":"086b1d61-526e-4540-a577-70f6d4cc1109","Type":"ContainerStarted","Data":"a9914d56c715d7d484412d0e82ee8851dbd6fdf5f1e0053cdfa7a8f5b7f5b3e3"} Apr 24 16:43:46.697933 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:46.697900 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rcmmv" event={"ID":"5f5bde11-c32a-402f-994e-143e03a8dd70","Type":"ContainerStarted","Data":"7da28ce7589ded289fd8170338ed4da1b5010467284c1360e9f5ab2aaf891728"} Apr 24 16:43:46.699691 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:46.699664 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vwpz2" event={"ID":"086b1d61-526e-4540-a577-70f6d4cc1109","Type":"ContainerStarted","Data":"9bf579a793662f56f246b9f7e1bd9a95d124deddd5091be30692e3a2c509d947"} Apr 24 16:43:46.715342 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:46.715283 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rcmmv" podStartSLOduration=250.851916719 podStartE2EDuration="4m12.715267486s" podCreationTimestamp="2026-04-24 16:39:34 +0000 UTC" firstStartedPulling="2026-04-24 16:43:44.610416978 +0000 UTC m=+283.281047860" lastFinishedPulling="2026-04-24 16:43:46.473767737 +0000 UTC m=+285.144398627" observedRunningTime="2026-04-24 16:43:46.714982527 +0000 UTC m=+285.385613428" 
watchObservedRunningTime="2026-04-24 16:43:46.715267486 +0000 UTC m=+285.385898390" Apr 24 16:43:47.704123 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:47.704090 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vwpz2" event={"ID":"086b1d61-526e-4540-a577-70f6d4cc1109","Type":"ContainerStarted","Data":"2979d305693b238c8d4faeb1b87c3d53b8b26950d2ecbd2a55c0e32aefc3f22a"} Apr 24 16:43:47.723460 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:47.723419 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vwpz2" podStartSLOduration=251.980495577 podStartE2EDuration="4m13.723403381s" podCreationTimestamp="2026-04-24 16:39:34 +0000 UTC" firstStartedPulling="2026-04-24 16:43:44.727270419 +0000 UTC m=+283.397901298" lastFinishedPulling="2026-04-24 16:43:46.47017822 +0000 UTC m=+285.140809102" observedRunningTime="2026-04-24 16:43:47.721917171 +0000 UTC m=+286.392548072" watchObservedRunningTime="2026-04-24 16:43:47.723403381 +0000 UTC m=+286.394034282" Apr 24 16:43:48.706986 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:48.706950 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-vwpz2" Apr 24 16:43:52.579557 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:52.579521 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c54bd79dd-mzrps"] Apr 24 16:43:58.711292 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:43:58.711259 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vwpz2" Apr 24 16:44:01.775353 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:01.775327 2563 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 16:44:17.598025 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:17.597957 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c54bd79dd-mzrps" 
podUID="89bc5cdc-1e79-447c-ac9d-917132630c70" containerName="console" containerID="cri-o://69942fe9b7c247af479fc33ab679c9eeacfc2afd1d7a9a16b586b7804c9beb17" gracePeriod=15 Apr 24 16:44:17.783857 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:17.783833 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c54bd79dd-mzrps_89bc5cdc-1e79-447c-ac9d-917132630c70/console/0.log" Apr 24 16:44:17.783986 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:17.783872 2563 generic.go:358] "Generic (PLEG): container finished" podID="89bc5cdc-1e79-447c-ac9d-917132630c70" containerID="69942fe9b7c247af479fc33ab679c9eeacfc2afd1d7a9a16b586b7804c9beb17" exitCode=2 Apr 24 16:44:17.783986 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:17.783924 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c54bd79dd-mzrps" event={"ID":"89bc5cdc-1e79-447c-ac9d-917132630c70","Type":"ContainerDied","Data":"69942fe9b7c247af479fc33ab679c9eeacfc2afd1d7a9a16b586b7804c9beb17"} Apr 24 16:44:17.830970 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:17.830948 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c54bd79dd-mzrps_89bc5cdc-1e79-447c-ac9d-917132630c70/console/0.log" Apr 24 16:44:17.831102 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:17.831018 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c54bd79dd-mzrps" Apr 24 16:44:18.026883 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.026852 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-console-config\") pod \"89bc5cdc-1e79-447c-ac9d-917132630c70\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " Apr 24 16:44:18.027065 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.026893 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-oauth-serving-cert\") pod \"89bc5cdc-1e79-447c-ac9d-917132630c70\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " Apr 24 16:44:18.027065 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.026921 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-service-ca\") pod \"89bc5cdc-1e79-447c-ac9d-917132630c70\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " Apr 24 16:44:18.027223 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.027071 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/89bc5cdc-1e79-447c-ac9d-917132630c70-console-serving-cert\") pod \"89bc5cdc-1e79-447c-ac9d-917132630c70\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " Apr 24 16:44:18.027223 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.027169 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-trusted-ca-bundle\") pod \"89bc5cdc-1e79-447c-ac9d-917132630c70\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " Apr 24 16:44:18.027223 ip-10-0-137-69 
kubenswrapper[2563]: I0424 16:44:18.027213 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-756m9\" (UniqueName: \"kubernetes.io/projected/89bc5cdc-1e79-447c-ac9d-917132630c70-kube-api-access-756m9\") pod \"89bc5cdc-1e79-447c-ac9d-917132630c70\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " Apr 24 16:44:18.027369 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.027247 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/89bc5cdc-1e79-447c-ac9d-917132630c70-console-oauth-config\") pod \"89bc5cdc-1e79-447c-ac9d-917132630c70\" (UID: \"89bc5cdc-1e79-447c-ac9d-917132630c70\") " Apr 24 16:44:18.027420 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.027354 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-console-config" (OuterVolumeSpecName: "console-config") pod "89bc5cdc-1e79-447c-ac9d-917132630c70" (UID: "89bc5cdc-1e79-447c-ac9d-917132630c70"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:44:18.027479 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.027464 2563 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-console-config\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:44:18.027576 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.027543 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "89bc5cdc-1e79-447c-ac9d-917132630c70" (UID: "89bc5cdc-1e79-447c-ac9d-917132630c70"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:44:18.027636 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.027554 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "89bc5cdc-1e79-447c-ac9d-917132630c70" (UID: "89bc5cdc-1e79-447c-ac9d-917132630c70"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:44:18.027701 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.027662 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-service-ca" (OuterVolumeSpecName: "service-ca") pod "89bc5cdc-1e79-447c-ac9d-917132630c70" (UID: "89bc5cdc-1e79-447c-ac9d-917132630c70"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:44:18.029484 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.029454 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89bc5cdc-1e79-447c-ac9d-917132630c70-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "89bc5cdc-1e79-447c-ac9d-917132630c70" (UID: "89bc5cdc-1e79-447c-ac9d-917132630c70"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:44:18.029598 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.029496 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89bc5cdc-1e79-447c-ac9d-917132630c70-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "89bc5cdc-1e79-447c-ac9d-917132630c70" (UID: "89bc5cdc-1e79-447c-ac9d-917132630c70"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:44:18.029598 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.029531 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89bc5cdc-1e79-447c-ac9d-917132630c70-kube-api-access-756m9" (OuterVolumeSpecName: "kube-api-access-756m9") pod "89bc5cdc-1e79-447c-ac9d-917132630c70" (UID: "89bc5cdc-1e79-447c-ac9d-917132630c70"). InnerVolumeSpecName "kube-api-access-756m9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:44:18.128450 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.128415 2563 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/89bc5cdc-1e79-447c-ac9d-917132630c70-console-serving-cert\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:44:18.128450 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.128448 2563 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-trusted-ca-bundle\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:44:18.128450 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.128458 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-756m9\" (UniqueName: \"kubernetes.io/projected/89bc5cdc-1e79-447c-ac9d-917132630c70-kube-api-access-756m9\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:44:18.128627 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.128468 2563 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/89bc5cdc-1e79-447c-ac9d-917132630c70-console-oauth-config\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:44:18.128627 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.128477 2563 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-oauth-serving-cert\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:44:18.128627 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.128486 2563 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/89bc5cdc-1e79-447c-ac9d-917132630c70-service-ca\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:44:18.787704 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.787680 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c54bd79dd-mzrps_89bc5cdc-1e79-447c-ac9d-917132630c70/console/0.log" Apr 24 16:44:18.788084 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.787742 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c54bd79dd-mzrps" event={"ID":"89bc5cdc-1e79-447c-ac9d-917132630c70","Type":"ContainerDied","Data":"2a751826ba4b23fc7f659d38da046b2539e908a329d6a5a14b1c30d9dde5b8be"} Apr 24 16:44:18.788084 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.787773 2563 scope.go:117] "RemoveContainer" containerID="69942fe9b7c247af479fc33ab679c9eeacfc2afd1d7a9a16b586b7804c9beb17" Apr 24 16:44:18.788084 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.787779 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c54bd79dd-mzrps" Apr 24 16:44:18.807464 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.807441 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c54bd79dd-mzrps"] Apr 24 16:44:18.812799 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:18.812779 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c54bd79dd-mzrps"] Apr 24 16:44:19.857935 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:44:19.857903 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89bc5cdc-1e79-447c-ac9d-917132630c70" path="/var/lib/kubelet/pods/89bc5cdc-1e79-447c-ac9d-917132630c70/volumes" Apr 24 16:45:19.922507 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:19.922474 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-ffbcc556b-96sbd"] Apr 24 16:45:19.922941 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:19.922712 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89bc5cdc-1e79-447c-ac9d-917132630c70" containerName="console" Apr 24 16:45:19.922941 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:19.922723 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="89bc5cdc-1e79-447c-ac9d-917132630c70" containerName="console" Apr 24 16:45:19.922941 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:19.922775 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="89bc5cdc-1e79-447c-ac9d-917132630c70" containerName="console" Apr 24 16:45:19.925685 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:19.925669 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:19.928038 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:19.928016 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 16:45:19.928186 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:19.928031 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 16:45:19.928258 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:19.928073 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 16:45:19.928312 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:19.928087 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 16:45:19.928312 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:19.928104 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 16:45:19.928517 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:19.928500 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-ctfdb\"" Apr 24 16:45:19.928745 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:19.928724 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 16:45:19.928830 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:19.928727 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 16:45:19.935145 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:19.935094 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 16:45:19.937319 ip-10-0-137-69 kubenswrapper[2563]: 
I0424 16:45:19.937285 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-ffbcc556b-96sbd"] Apr 24 16:45:20.030265 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.030232 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-service-ca\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.030433 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.030270 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-console-config\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.030433 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.030333 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn8qq\" (UniqueName: \"kubernetes.io/projected/73330567-ec1a-4448-ade8-f58e81dc603d-kube-api-access-bn8qq\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.030433 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.030373 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/73330567-ec1a-4448-ade8-f58e81dc603d-console-oauth-config\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.030588 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.030436 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/73330567-ec1a-4448-ade8-f58e81dc603d-console-serving-cert\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.030588 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.030474 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-trusted-ca-bundle\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.030588 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.030504 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-oauth-serving-cert\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.130922 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.130891 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/73330567-ec1a-4448-ade8-f58e81dc603d-console-serving-cert\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.131071 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.130932 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-trusted-ca-bundle\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.131071 ip-10-0-137-69 
kubenswrapper[2563]: I0424 16:45:20.130958 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-oauth-serving-cert\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.131071 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.131023 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-service-ca\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.131071 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.131059 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-console-config\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.131325 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.131097 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bn8qq\" (UniqueName: \"kubernetes.io/projected/73330567-ec1a-4448-ade8-f58e81dc603d-kube-api-access-bn8qq\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.131325 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.131155 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/73330567-ec1a-4448-ade8-f58e81dc603d-console-oauth-config\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " 
pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.131750 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.131728 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-oauth-serving-cert\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.131954 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.131931 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-service-ca\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.132005 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.131956 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-console-config\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.132041 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.132016 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-trusted-ca-bundle\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.133510 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.133492 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/73330567-ec1a-4448-ade8-f58e81dc603d-console-serving-cert\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " 
pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.133617 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.133598 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/73330567-ec1a-4448-ade8-f58e81dc603d-console-oauth-config\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.139145 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.139110 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn8qq\" (UniqueName: \"kubernetes.io/projected/73330567-ec1a-4448-ade8-f58e81dc603d-kube-api-access-bn8qq\") pod \"console-ffbcc556b-96sbd\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.235542 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.235505 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:20.355386 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.355350 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-ffbcc556b-96sbd"] Apr 24 16:45:20.358240 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:45:20.358207 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73330567_ec1a_4448_ade8_f58e81dc603d.slice/crio-af7bd8b002029b9fe533a72965d7074afddeac93973beed815a99ea1c56b8329 WatchSource:0}: Error finding container af7bd8b002029b9fe533a72965d7074afddeac93973beed815a99ea1c56b8329: Status 404 returned error can't find the container with id af7bd8b002029b9fe533a72965d7074afddeac93973beed815a99ea1c56b8329 Apr 24 16:45:20.359945 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.359930 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 
16:45:20.936332 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.936298 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ffbcc556b-96sbd" event={"ID":"73330567-ec1a-4448-ade8-f58e81dc603d","Type":"ContainerStarted","Data":"31719586e890d2830cc2bf948062e0a1e8082300919712d8c3edd63b5a8d3858"} Apr 24 16:45:20.936332 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.936333 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ffbcc556b-96sbd" event={"ID":"73330567-ec1a-4448-ade8-f58e81dc603d","Type":"ContainerStarted","Data":"af7bd8b002029b9fe533a72965d7074afddeac93973beed815a99ea1c56b8329"} Apr 24 16:45:20.956039 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:20.955995 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-ffbcc556b-96sbd" podStartSLOduration=1.95598091 podStartE2EDuration="1.95598091s" podCreationTimestamp="2026-04-24 16:45:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:45:20.95566038 +0000 UTC m=+379.626291282" watchObservedRunningTime="2026-04-24 16:45:20.95598091 +0000 UTC m=+379.626611811" Apr 24 16:45:30.235652 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:30.235612 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:30.235652 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:30.235660 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:30.240450 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:30.240421 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:45:30.960842 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:45:30.960814 2563 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:46:12.172817 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:12.172785 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-9l6h2"] Apr 24 16:46:12.175678 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:12.175663 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-9l6h2" Apr 24 16:46:12.179635 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:12.179618 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 16:46:12.179976 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:12.179957 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 16:46:12.180058 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:12.180043 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 16:46:12.180675 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:12.180626 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 24 16:46:12.180777 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:12.180681 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-6jw28\"" Apr 24 16:46:12.189698 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:12.189679 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-9l6h2"] Apr 24 16:46:12.190886 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:12.190858 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt4l2\" (UniqueName: 
\"kubernetes.io/projected/047bc0a2-e495-4d4b-86e0-11f2fc9a0372-kube-api-access-lt4l2\") pod \"keda-admission-cf49989db-9l6h2\" (UID: \"047bc0a2-e495-4d4b-86e0-11f2fc9a0372\") " pod="openshift-keda/keda-admission-cf49989db-9l6h2" Apr 24 16:46:12.190976 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:12.190946 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/047bc0a2-e495-4d4b-86e0-11f2fc9a0372-certificates\") pod \"keda-admission-cf49989db-9l6h2\" (UID: \"047bc0a2-e495-4d4b-86e0-11f2fc9a0372\") " pod="openshift-keda/keda-admission-cf49989db-9l6h2" Apr 24 16:46:12.291362 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:12.291329 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/047bc0a2-e495-4d4b-86e0-11f2fc9a0372-certificates\") pod \"keda-admission-cf49989db-9l6h2\" (UID: \"047bc0a2-e495-4d4b-86e0-11f2fc9a0372\") " pod="openshift-keda/keda-admission-cf49989db-9l6h2" Apr 24 16:46:12.291362 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:12.291373 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt4l2\" (UniqueName: \"kubernetes.io/projected/047bc0a2-e495-4d4b-86e0-11f2fc9a0372-kube-api-access-lt4l2\") pod \"keda-admission-cf49989db-9l6h2\" (UID: \"047bc0a2-e495-4d4b-86e0-11f2fc9a0372\") " pod="openshift-keda/keda-admission-cf49989db-9l6h2" Apr 24 16:46:12.291629 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:46:12.291494 2563 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 24 16:46:12.291629 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:46:12.291521 2563 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-9l6h2: secret "keda-admission-webhooks-certs" not found Apr 24 
16:46:12.291629 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:46:12.291584 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/047bc0a2-e495-4d4b-86e0-11f2fc9a0372-certificates podName:047bc0a2-e495-4d4b-86e0-11f2fc9a0372 nodeName:}" failed. No retries permitted until 2026-04-24 16:46:12.791563252 +0000 UTC m=+431.462194137 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/047bc0a2-e495-4d4b-86e0-11f2fc9a0372-certificates") pod "keda-admission-cf49989db-9l6h2" (UID: "047bc0a2-e495-4d4b-86e0-11f2fc9a0372") : secret "keda-admission-webhooks-certs" not found Apr 24 16:46:12.300935 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:12.300905 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt4l2\" (UniqueName: \"kubernetes.io/projected/047bc0a2-e495-4d4b-86e0-11f2fc9a0372-kube-api-access-lt4l2\") pod \"keda-admission-cf49989db-9l6h2\" (UID: \"047bc0a2-e495-4d4b-86e0-11f2fc9a0372\") " pod="openshift-keda/keda-admission-cf49989db-9l6h2" Apr 24 16:46:12.794749 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:12.794711 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/047bc0a2-e495-4d4b-86e0-11f2fc9a0372-certificates\") pod \"keda-admission-cf49989db-9l6h2\" (UID: \"047bc0a2-e495-4d4b-86e0-11f2fc9a0372\") " pod="openshift-keda/keda-admission-cf49989db-9l6h2" Apr 24 16:46:12.797061 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:12.797039 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/047bc0a2-e495-4d4b-86e0-11f2fc9a0372-certificates\") pod \"keda-admission-cf49989db-9l6h2\" (UID: \"047bc0a2-e495-4d4b-86e0-11f2fc9a0372\") " pod="openshift-keda/keda-admission-cf49989db-9l6h2" Apr 24 16:46:13.085287 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:13.085208 2563 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-9l6h2" Apr 24 16:46:13.220738 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:13.220701 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-9l6h2"] Apr 24 16:46:13.225810 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:46:13.225782 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod047bc0a2_e495_4d4b_86e0_11f2fc9a0372.slice/crio-a638cd15e3550b52534891cf39b42eb59ffb823a94f691841f7cb6cb5f4b3f57 WatchSource:0}: Error finding container a638cd15e3550b52534891cf39b42eb59ffb823a94f691841f7cb6cb5f4b3f57: Status 404 returned error can't find the container with id a638cd15e3550b52534891cf39b42eb59ffb823a94f691841f7cb6cb5f4b3f57 Apr 24 16:46:14.067102 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:14.067064 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-9l6h2" event={"ID":"047bc0a2-e495-4d4b-86e0-11f2fc9a0372","Type":"ContainerStarted","Data":"a638cd15e3550b52534891cf39b42eb59ffb823a94f691841f7cb6cb5f4b3f57"} Apr 24 16:46:16.073672 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:16.073636 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-9l6h2" event={"ID":"047bc0a2-e495-4d4b-86e0-11f2fc9a0372","Type":"ContainerStarted","Data":"dda53e9a734bc981e686baa65a0e8ebbb669288558ef44c61b9267c605067b74"} Apr 24 16:46:16.074040 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:16.073791 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-9l6h2" Apr 24 16:46:16.093924 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:16.093880 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-9l6h2" podStartSLOduration=1.697380841 
podStartE2EDuration="4.093867687s" podCreationTimestamp="2026-04-24 16:46:12 +0000 UTC" firstStartedPulling="2026-04-24 16:46:13.226960017 +0000 UTC m=+431.897590897" lastFinishedPulling="2026-04-24 16:46:15.623446861 +0000 UTC m=+434.294077743" observedRunningTime="2026-04-24 16:46:16.092532596 +0000 UTC m=+434.763163498" watchObservedRunningTime="2026-04-24 16:46:16.093867687 +0000 UTC m=+434.764498586" Apr 24 16:46:37.079061 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:46:37.078992 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-9l6h2" Apr 24 16:47:25.859667 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:25.859634 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-ztcwk"] Apr 24 16:47:25.861717 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:25.861703 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-ztcwk" Apr 24 16:47:25.864709 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:25.864677 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 24 16:47:25.865454 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:25.865428 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 24 16:47:25.865454 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:25.865429 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-fk4wb\"" Apr 24 16:47:25.876031 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:25.876005 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-ztcwk"] Apr 24 16:47:25.984887 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:25.984860 2563 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1f8e754-d591-4fa7-89df-f725097609f3-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-ztcwk\" (UID: \"e1f8e754-d591-4fa7-89df-f725097609f3\") " pod="cert-manager/cert-manager-cainjector-68b757865b-ztcwk" Apr 24 16:47:25.985029 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:25.984899 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8rmh\" (UniqueName: \"kubernetes.io/projected/e1f8e754-d591-4fa7-89df-f725097609f3-kube-api-access-t8rmh\") pod \"cert-manager-cainjector-68b757865b-ztcwk\" (UID: \"e1f8e754-d591-4fa7-89df-f725097609f3\") " pod="cert-manager/cert-manager-cainjector-68b757865b-ztcwk" Apr 24 16:47:26.085980 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:26.085944 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1f8e754-d591-4fa7-89df-f725097609f3-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-ztcwk\" (UID: \"e1f8e754-d591-4fa7-89df-f725097609f3\") " pod="cert-manager/cert-manager-cainjector-68b757865b-ztcwk" Apr 24 16:47:26.086117 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:26.085996 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8rmh\" (UniqueName: \"kubernetes.io/projected/e1f8e754-d591-4fa7-89df-f725097609f3-kube-api-access-t8rmh\") pod \"cert-manager-cainjector-68b757865b-ztcwk\" (UID: \"e1f8e754-d591-4fa7-89df-f725097609f3\") " pod="cert-manager/cert-manager-cainjector-68b757865b-ztcwk" Apr 24 16:47:26.095175 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:26.095153 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1f8e754-d591-4fa7-89df-f725097609f3-bound-sa-token\") pod 
\"cert-manager-cainjector-68b757865b-ztcwk\" (UID: \"e1f8e754-d591-4fa7-89df-f725097609f3\") " pod="cert-manager/cert-manager-cainjector-68b757865b-ztcwk" Apr 24 16:47:26.095376 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:26.095353 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8rmh\" (UniqueName: \"kubernetes.io/projected/e1f8e754-d591-4fa7-89df-f725097609f3-kube-api-access-t8rmh\") pod \"cert-manager-cainjector-68b757865b-ztcwk\" (UID: \"e1f8e754-d591-4fa7-89df-f725097609f3\") " pod="cert-manager/cert-manager-cainjector-68b757865b-ztcwk" Apr 24 16:47:26.169668 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:26.169606 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-ztcwk" Apr 24 16:47:26.282565 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:26.282534 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-ztcwk"] Apr 24 16:47:26.285552 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:47:26.285526 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1f8e754_d591_4fa7_89df_f725097609f3.slice/crio-2bed23c76c48e6f2ab342953809c75002d137c914234da28f9f8ff3a507d96db WatchSource:0}: Error finding container 2bed23c76c48e6f2ab342953809c75002d137c914234da28f9f8ff3a507d96db: Status 404 returned error can't find the container with id 2bed23c76c48e6f2ab342953809c75002d137c914234da28f9f8ff3a507d96db Apr 24 16:47:27.258402 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:27.258359 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-ztcwk" event={"ID":"e1f8e754-d591-4fa7-89df-f725097609f3","Type":"ContainerStarted","Data":"2bed23c76c48e6f2ab342953809c75002d137c914234da28f9f8ff3a507d96db"} Apr 24 16:47:29.265913 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:29.265877 
2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-ztcwk" event={"ID":"e1f8e754-d591-4fa7-89df-f725097609f3","Type":"ContainerStarted","Data":"9df13fd1dae5dbd268d355d56fdd717cfe52095d3d91be44ad22b3c107996e74"} Apr 24 16:47:29.283610 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:29.283565 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-ztcwk" podStartSLOduration=1.398944615 podStartE2EDuration="4.283552706s" podCreationTimestamp="2026-04-24 16:47:25 +0000 UTC" firstStartedPulling="2026-04-24 16:47:26.287240777 +0000 UTC m=+504.957871656" lastFinishedPulling="2026-04-24 16:47:29.171848854 +0000 UTC m=+507.842479747" observedRunningTime="2026-04-24 16:47:29.282100923 +0000 UTC m=+507.952731860" watchObservedRunningTime="2026-04-24 16:47:29.283552706 +0000 UTC m=+507.954183607" Apr 24 16:47:41.021329 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:41.021295 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-gmjn8"] Apr 24 16:47:41.023464 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:41.023448 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-gmjn8" Apr 24 16:47:41.027582 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:41.027562 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-42fw2\"" Apr 24 16:47:41.047958 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:41.047923 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-gmjn8"] Apr 24 16:47:41.193347 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:41.193317 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kntd\" (UniqueName: \"kubernetes.io/projected/e4d228fd-bc1e-4885-8ffc-9065428f19b6-kube-api-access-7kntd\") pod \"cert-manager-79c8d999ff-gmjn8\" (UID: \"e4d228fd-bc1e-4885-8ffc-9065428f19b6\") " pod="cert-manager/cert-manager-79c8d999ff-gmjn8" Apr 24 16:47:41.193498 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:41.193357 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e4d228fd-bc1e-4885-8ffc-9065428f19b6-bound-sa-token\") pod \"cert-manager-79c8d999ff-gmjn8\" (UID: \"e4d228fd-bc1e-4885-8ffc-9065428f19b6\") " pod="cert-manager/cert-manager-79c8d999ff-gmjn8" Apr 24 16:47:41.294514 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:41.294429 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e4d228fd-bc1e-4885-8ffc-9065428f19b6-bound-sa-token\") pod \"cert-manager-79c8d999ff-gmjn8\" (UID: \"e4d228fd-bc1e-4885-8ffc-9065428f19b6\") " pod="cert-manager/cert-manager-79c8d999ff-gmjn8" Apr 24 16:47:41.294668 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:41.294512 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kntd\" (UniqueName: 
\"kubernetes.io/projected/e4d228fd-bc1e-4885-8ffc-9065428f19b6-kube-api-access-7kntd\") pod \"cert-manager-79c8d999ff-gmjn8\" (UID: \"e4d228fd-bc1e-4885-8ffc-9065428f19b6\") " pod="cert-manager/cert-manager-79c8d999ff-gmjn8" Apr 24 16:47:41.305556 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:41.305529 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e4d228fd-bc1e-4885-8ffc-9065428f19b6-bound-sa-token\") pod \"cert-manager-79c8d999ff-gmjn8\" (UID: \"e4d228fd-bc1e-4885-8ffc-9065428f19b6\") " pod="cert-manager/cert-manager-79c8d999ff-gmjn8" Apr 24 16:47:41.306828 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:41.306805 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kntd\" (UniqueName: \"kubernetes.io/projected/e4d228fd-bc1e-4885-8ffc-9065428f19b6-kube-api-access-7kntd\") pod \"cert-manager-79c8d999ff-gmjn8\" (UID: \"e4d228fd-bc1e-4885-8ffc-9065428f19b6\") " pod="cert-manager/cert-manager-79c8d999ff-gmjn8" Apr 24 16:47:41.331656 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:41.331637 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-gmjn8"
Apr 24 16:47:41.470058 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:41.470033 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-gmjn8"]
Apr 24 16:47:41.471675 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:47:41.471650 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4d228fd_bc1e_4885_8ffc_9065428f19b6.slice/crio-c674d7ae66c9fd96118d2dbb064751ac81130a54be3342966ef1ef4a734e252c WatchSource:0}: Error finding container c674d7ae66c9fd96118d2dbb064751ac81130a54be3342966ef1ef4a734e252c: Status 404 returned error can't find the container with id c674d7ae66c9fd96118d2dbb064751ac81130a54be3342966ef1ef4a734e252c
Apr 24 16:47:42.300628 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:42.300589 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-gmjn8" event={"ID":"e4d228fd-bc1e-4885-8ffc-9065428f19b6","Type":"ContainerStarted","Data":"be3789cc0dd01318b8cda6e1dab3d0832a4b1f9d09aaa3aecfd168d10a712e06"}
Apr 24 16:47:42.300628 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:42.300631 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-gmjn8" event={"ID":"e4d228fd-bc1e-4885-8ffc-9065428f19b6","Type":"ContainerStarted","Data":"c674d7ae66c9fd96118d2dbb064751ac81130a54be3342966ef1ef4a734e252c"}
Apr 24 16:47:42.318650 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:47:42.318601 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-gmjn8" podStartSLOduration=2.3185862999999998 podStartE2EDuration="2.3185863s" podCreationTimestamp="2026-04-24 16:47:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:47:42.318094806 +0000 UTC m=+520.988725707" watchObservedRunningTime="2026-04-24 16:47:42.3185863 +0000 UTC m=+520.989217202"
Apr 24 16:48:15.100191 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.100155 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"]
Apr 24 16:48:15.107104 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.107074 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.109629 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.109605 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 24 16:48:15.109736 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.109688 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 24 16:48:15.109736 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.109726 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 24 16:48:15.109841 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.109744 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 24 16:48:15.109841 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.109756 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 24 16:48:15.109841 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.109815 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 24 16:48:15.109992 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.109874 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-c5nzf\""
Apr 24 16:48:15.113890 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.113865 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"]
Apr 24 16:48:15.124529 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.124509 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.124624 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.124543 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.124624 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.124561 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/d9bb6899-ba48-46d5-a4a6-5c43217f7328-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.124624 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.124597 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-489n2\" (UniqueName: \"kubernetes.io/projected/d9bb6899-ba48-46d5-a4a6-5c43217f7328-kube-api-access-489n2\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.124729 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.124657 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.124729 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.124682 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.124729 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.124718 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/d9bb6899-ba48-46d5-a4a6-5c43217f7328-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.225643 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.225604 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.225913 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.225669 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.225913 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.225694 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/d9bb6899-ba48-46d5-a4a6-5c43217f7328-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.225913 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.225891 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-489n2\" (UniqueName: \"kubernetes.io/projected/d9bb6899-ba48-46d5-a4a6-5c43217f7328-kube-api-access-489n2\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.226177 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.225969 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.226177 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.226008 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.226177 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.226073 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/d9bb6899-ba48-46d5-a4a6-5c43217f7328-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.226338 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.226306 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.228257 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.228234 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/d9bb6899-ba48-46d5-a4a6-5c43217f7328-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.228374 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.228253 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/d9bb6899-ba48-46d5-a4a6-5c43217f7328-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.228439 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.228402 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.228560 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.228544 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.236378 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.236347 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-489n2\" (UniqueName: \"kubernetes.io/projected/d9bb6899-ba48-46d5-a4a6-5c43217f7328-kube-api-access-489n2\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.236919 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.236896 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-8lmds\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.416727 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.416631 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:15.546101 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:15.542606 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"]
Apr 24 16:48:15.548929 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:48:15.548900 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9bb6899_ba48_46d5_a4a6_5c43217f7328.slice/crio-68635c011b6e935348965e3ecab0f55c567dc9a9645586e2eb1ce2739cd0fd00 WatchSource:0}: Error finding container 68635c011b6e935348965e3ecab0f55c567dc9a9645586e2eb1ce2739cd0fd00: Status 404 returned error can't find the container with id 68635c011b6e935348965e3ecab0f55c567dc9a9645586e2eb1ce2739cd0fd00
Apr 24 16:48:16.391939 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:16.391900 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds" event={"ID":"d9bb6899-ba48-46d5-a4a6-5c43217f7328","Type":"ContainerStarted","Data":"68635c011b6e935348965e3ecab0f55c567dc9a9645586e2eb1ce2739cd0fd00"}
Apr 24 16:48:17.911545 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:17.911503 2563 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"}
Apr 24 16:48:17.911775 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:17.911593 2563 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"}
Apr 24 16:48:18.398843 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:18.398808 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds" event={"ID":"d9bb6899-ba48-46d5-a4a6-5c43217f7328","Type":"ContainerStarted","Data":"7741232101e80cdaffae641fcc8acb95738a1613e6e193bc7387ba176184ac89"}
Apr 24 16:48:18.399033 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:18.399013 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:18.400582 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:18.400558 2563 patch_prober.go:28] interesting pod/istiod-openshift-gateway-7cd77c7ffd-8lmds container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 24 16:48:18.400682 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:18.400605 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds" podUID="d9bb6899-ba48-46d5-a4a6-5c43217f7328" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:48:18.420354 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:18.420317 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds" podStartSLOduration=1.0596997400000001 podStartE2EDuration="3.420307482s" podCreationTimestamp="2026-04-24 16:48:15 +0000 UTC" firstStartedPulling="2026-04-24 16:48:15.550680884 +0000 UTC m=+554.221311763" lastFinishedPulling="2026-04-24 16:48:17.911288626 +0000 UTC m=+556.581919505" observedRunningTime="2026-04-24 16:48:18.419755542 +0000 UTC m=+557.090386443" watchObservedRunningTime="2026-04-24 16:48:18.420307482 +0000 UTC m=+557.090938383"
Apr 24 16:48:19.402103 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:19.402076 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"
Apr 24 16:48:41.671678 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:41.671645 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-x9t7n"]
Apr 24 16:48:41.674759 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:41.674743 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-x9t7n"
Apr 24 16:48:41.676992 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:41.676968 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 24 16:48:41.676992 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:41.676989 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 24 16:48:41.677172 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:41.677066 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-68fkm\""
Apr 24 16:48:41.683656 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:41.683630 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-x9t7n"]
Apr 24 16:48:41.716715 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:41.716685 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q2lx\" (UniqueName: \"kubernetes.io/projected/241305b7-c446-4de1-b201-1446823b4e25-kube-api-access-2q2lx\") pod \"authorino-operator-7587b89b76-x9t7n\" (UID: \"241305b7-c446-4de1-b201-1446823b4e25\") " pod="kuadrant-system/authorino-operator-7587b89b76-x9t7n"
Apr 24 16:48:41.817387 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:41.817357 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2q2lx\" (UniqueName: \"kubernetes.io/projected/241305b7-c446-4de1-b201-1446823b4e25-kube-api-access-2q2lx\") pod \"authorino-operator-7587b89b76-x9t7n\" (UID: \"241305b7-c446-4de1-b201-1446823b4e25\") " pod="kuadrant-system/authorino-operator-7587b89b76-x9t7n"
Apr 24 16:48:41.827063 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:41.827039 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q2lx\" (UniqueName: \"kubernetes.io/projected/241305b7-c446-4de1-b201-1446823b4e25-kube-api-access-2q2lx\") pod \"authorino-operator-7587b89b76-x9t7n\" (UID: \"241305b7-c446-4de1-b201-1446823b4e25\") " pod="kuadrant-system/authorino-operator-7587b89b76-x9t7n"
Apr 24 16:48:41.984443 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:41.984422 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-x9t7n"
Apr 24 16:48:42.100211 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:42.100180 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-x9t7n"]
Apr 24 16:48:42.103688 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:48:42.103662 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod241305b7_c446_4de1_b201_1446823b4e25.slice/crio-1179d1ab881291c6482fe2e6da82e7dcef81654e2efce41e3410dd90c429c9cd WatchSource:0}: Error finding container 1179d1ab881291c6482fe2e6da82e7dcef81654e2efce41e3410dd90c429c9cd: Status 404 returned error can't find the container with id 1179d1ab881291c6482fe2e6da82e7dcef81654e2efce41e3410dd90c429c9cd
Apr 24 16:48:42.464785 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:42.464743 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-x9t7n" event={"ID":"241305b7-c446-4de1-b201-1446823b4e25","Type":"ContainerStarted","Data":"1179d1ab881291c6482fe2e6da82e7dcef81654e2efce41e3410dd90c429c9cd"}
Apr 24 16:48:43.398957 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.398915 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-67dc464fdd-fvrg2"]
Apr 24 16:48:43.404618 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.404580 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.405674 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.405645 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67dc464fdd-fvrg2"]
Apr 24 16:48:43.530748 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.530714 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e07f75a-06de-48fb-a96f-5df55fd55f5d-service-ca\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.530748 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.530751 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e07f75a-06de-48fb-a96f-5df55fd55f5d-trusted-ca-bundle\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.531015 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.530795 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e07f75a-06de-48fb-a96f-5df55fd55f5d-console-serving-cert\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.531015 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.530811 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e07f75a-06de-48fb-a96f-5df55fd55f5d-console-config\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.531015 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.530836 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz2sb\" (UniqueName: \"kubernetes.io/projected/5e07f75a-06de-48fb-a96f-5df55fd55f5d-kube-api-access-nz2sb\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.531015 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.530926 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e07f75a-06de-48fb-a96f-5df55fd55f5d-oauth-serving-cert\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.531015 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.530966 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e07f75a-06de-48fb-a96f-5df55fd55f5d-console-oauth-config\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.631609 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.631575 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e07f75a-06de-48fb-a96f-5df55fd55f5d-oauth-serving-cert\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.631609 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.631618 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e07f75a-06de-48fb-a96f-5df55fd55f5d-console-oauth-config\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.631821 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.631676 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e07f75a-06de-48fb-a96f-5df55fd55f5d-service-ca\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.631821 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.631698 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e07f75a-06de-48fb-a96f-5df55fd55f5d-trusted-ca-bundle\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.631821 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.631752 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e07f75a-06de-48fb-a96f-5df55fd55f5d-console-serving-cert\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.631821 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.631777 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e07f75a-06de-48fb-a96f-5df55fd55f5d-console-config\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.631821 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.631800 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nz2sb\" (UniqueName: \"kubernetes.io/projected/5e07f75a-06de-48fb-a96f-5df55fd55f5d-kube-api-access-nz2sb\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.632375 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.632346 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e07f75a-06de-48fb-a96f-5df55fd55f5d-oauth-serving-cert\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.632505 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.632468 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e07f75a-06de-48fb-a96f-5df55fd55f5d-console-config\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.632649 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.632624 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e07f75a-06de-48fb-a96f-5df55fd55f5d-trusted-ca-bundle\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.632744 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.632683 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e07f75a-06de-48fb-a96f-5df55fd55f5d-service-ca\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.634426 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.634409 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e07f75a-06de-48fb-a96f-5df55fd55f5d-console-serving-cert\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.634687 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.634665 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e07f75a-06de-48fb-a96f-5df55fd55f5d-console-oauth-config\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.642299 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.642269 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz2sb\" (UniqueName: \"kubernetes.io/projected/5e07f75a-06de-48fb-a96f-5df55fd55f5d-kube-api-access-nz2sb\") pod \"console-67dc464fdd-fvrg2\" (UID: \"5e07f75a-06de-48fb-a96f-5df55fd55f5d\") " pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:43.716100 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:43.716063 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67dc464fdd-fvrg2"
Apr 24 16:48:44.401125 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:44.401103 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67dc464fdd-fvrg2"]
Apr 24 16:48:44.403788 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:48:44.403764 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e07f75a_06de_48fb_a96f_5df55fd55f5d.slice/crio-4199ddb1a181081605d1a550d54d7bc78abd33a854ee75222a433af46be1aa28 WatchSource:0}: Error finding container 4199ddb1a181081605d1a550d54d7bc78abd33a854ee75222a433af46be1aa28: Status 404 returned error can't find the container with id 4199ddb1a181081605d1a550d54d7bc78abd33a854ee75222a433af46be1aa28
Apr 24 16:48:44.473349 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:44.473303 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-x9t7n" event={"ID":"241305b7-c446-4de1-b201-1446823b4e25","Type":"ContainerStarted","Data":"98495292167d95ffe6bff0d738041fcbf9d4d3d0961fa81440d50cb02ee3b957"}
Apr 24 16:48:44.473531 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:44.473509 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-x9t7n"
Apr 24 16:48:44.475259 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:44.475235 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67dc464fdd-fvrg2" event={"ID":"5e07f75a-06de-48fb-a96f-5df55fd55f5d","Type":"ContainerStarted","Data":"16197c3fbb442694927f4980cdb0615cc1d13fbdcd070c4bb132285fd1497e82"}
Apr 24 16:48:44.475259 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:44.475263 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67dc464fdd-fvrg2" event={"ID":"5e07f75a-06de-48fb-a96f-5df55fd55f5d","Type":"ContainerStarted","Data":"4199ddb1a181081605d1a550d54d7bc78abd33a854ee75222a433af46be1aa28"}
Apr 24 16:48:44.498505 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:44.498461 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-x9t7n" podStartSLOduration=1.259822592 podStartE2EDuration="3.49844796s" podCreationTimestamp="2026-04-24 16:48:41 +0000 UTC" firstStartedPulling="2026-04-24 16:48:42.105679999 +0000 UTC m=+580.776310878" lastFinishedPulling="2026-04-24 16:48:44.34430535 +0000 UTC m=+583.014936246" observedRunningTime="2026-04-24 16:48:44.49711604 +0000 UTC m=+583.167746965" watchObservedRunningTime="2026-04-24 16:48:44.49844796 +0000 UTC m=+583.169078861"
Apr 24 16:48:44.522151 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:44.522092 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67dc464fdd-fvrg2" podStartSLOduration=1.5220755480000001 podStartE2EDuration="1.522075548s" podCreationTimestamp="2026-04-24 16:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:48:44.521356936 +0000 UTC m=+583.191987837" watchObservedRunningTime="2026-04-24 16:48:44.522075548 +0000 UTC m=+583.192706449"
Apr 24 16:48:45.183834 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:45.183794 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bjv8q"]
Apr 24 16:48:45.187004 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:45.186988 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bjv8q"
Apr 24 16:48:45.189444 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:45.189421 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-b8m22\""
Apr 24 16:48:45.201044 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:45.201023 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bjv8q"]
Apr 24 16:48:45.246363 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:45.246327 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/be1d3f54-1ae3-44d1-ae31-a9f37c8a1ea3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-bjv8q\" (UID: \"be1d3f54-1ae3-44d1-ae31-a9f37c8a1ea3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bjv8q"
Apr 24 16:48:45.246363 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:45.246368 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spp66\" (UniqueName: \"kubernetes.io/projected/be1d3f54-1ae3-44d1-ae31-a9f37c8a1ea3-kube-api-access-spp66\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-bjv8q\" (UID: \"be1d3f54-1ae3-44d1-ae31-a9f37c8a1ea3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bjv8q"
Apr 24 16:48:45.347131 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:45.347100 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/be1d3f54-1ae3-44d1-ae31-a9f37c8a1ea3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-bjv8q\" (UID: \"be1d3f54-1ae3-44d1-ae31-a9f37c8a1ea3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bjv8q"
Apr 24 16:48:45.347131 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:45.347131 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spp66\" (UniqueName: \"kubernetes.io/projected/be1d3f54-1ae3-44d1-ae31-a9f37c8a1ea3-kube-api-access-spp66\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-bjv8q\" (UID: \"be1d3f54-1ae3-44d1-ae31-a9f37c8a1ea3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bjv8q"
Apr 24 16:48:45.347471 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:45.347450 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/be1d3f54-1ae3-44d1-ae31-a9f37c8a1ea3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-bjv8q\" (UID: \"be1d3f54-1ae3-44d1-ae31-a9f37c8a1ea3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bjv8q"
Apr 24 16:48:45.361669 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:45.361638 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spp66\" (UniqueName: \"kubernetes.io/projected/be1d3f54-1ae3-44d1-ae31-a9f37c8a1ea3-kube-api-access-spp66\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-bjv8q\" (UID: \"be1d3f54-1ae3-44d1-ae31-a9f37c8a1ea3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bjv8q"
Apr 24 16:48:45.496220 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:45.496195 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bjv8q"
Apr 24 16:48:45.616660 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:45.616626 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bjv8q"]
Apr 24 16:48:45.620885 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:48:45.620860 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe1d3f54_1ae3_44d1_ae31_a9f37c8a1ea3.slice/crio-cc292c2e2c42ba85a950769a3037a9bd88a909645c40b2c48a7b9eed6ceff94b WatchSource:0}: Error finding container cc292c2e2c42ba85a950769a3037a9bd88a909645c40b2c48a7b9eed6ceff94b: Status 404 returned error can't find the container with id cc292c2e2c42ba85a950769a3037a9bd88a909645c40b2c48a7b9eed6ceff94b
Apr 24 16:48:46.481682 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:46.481646 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bjv8q" event={"ID":"be1d3f54-1ae3-44d1-ae31-a9f37c8a1ea3","Type":"ContainerStarted","Data":"cc292c2e2c42ba85a950769a3037a9bd88a909645c40b2c48a7b9eed6ceff94b"}
Apr 24 16:48:50.500990 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:50.500954 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bjv8q" event={"ID":"be1d3f54-1ae3-44d1-ae31-a9f37c8a1ea3","Type":"ContainerStarted","Data":"cb7a178eac9a9baef54d41ed923512451fc8546e505cce65d8b5a00ab551e0e0"}
Apr 24 16:48:50.501364 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:50.501059 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bjv8q"
Apr 24 16:48:50.521662 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:50.521613 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bjv8q" podStartSLOduration=1.504553298 podStartE2EDuration="5.521599819s" podCreationTimestamp="2026-04-24 16:48:45 +0000 UTC" firstStartedPulling="2026-04-24 16:48:45.622964997 +0000 UTC m=+584.293595876" lastFinishedPulling="2026-04-24 16:48:49.640011505 +0000 UTC m=+588.310642397" observedRunningTime="2026-04-24 16:48:50.519513233 +0000 UTC m=+589.190144136" watchObservedRunningTime="2026-04-24 16:48:50.521599819 +0000 UTC m=+589.192230721" Apr 24 16:48:53.716353 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:53.716316 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-67dc464fdd-fvrg2" Apr 24 16:48:53.716353 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:53.716351 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-67dc464fdd-fvrg2" Apr 24 16:48:53.720950 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:53.720930 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-67dc464fdd-fvrg2" Apr 24 16:48:54.518641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:54.518610 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-67dc464fdd-fvrg2" Apr 24 16:48:54.589469 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:54.589436 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-ffbcc556b-96sbd"] Apr 24 16:48:55.480349 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:48:55.480320 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-x9t7n" Apr 24 16:49:01.507716 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:01.507682 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-bjv8q" Apr 24 16:49:19.608977 
ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:19.608914 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-ffbcc556b-96sbd" podUID="73330567-ec1a-4448-ade8-f58e81dc603d" containerName="console" containerID="cri-o://31719586e890d2830cc2bf948062e0a1e8082300919712d8c3edd63b5a8d3858" gracePeriod=15 Apr 24 16:49:19.860820 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:19.860772 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-ffbcc556b-96sbd_73330567-ec1a-4448-ade8-f58e81dc603d/console/0.log" Apr 24 16:49:19.860918 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:19.860828 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:49:20.007735 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.007706 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-oauth-serving-cert\") pod \"73330567-ec1a-4448-ade8-f58e81dc603d\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " Apr 24 16:49:20.007899 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.007742 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-service-ca\") pod \"73330567-ec1a-4448-ade8-f58e81dc603d\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " Apr 24 16:49:20.007899 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.007757 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-console-config\") pod \"73330567-ec1a-4448-ade8-f58e81dc603d\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " Apr 24 16:49:20.007899 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.007791 
2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn8qq\" (UniqueName: \"kubernetes.io/projected/73330567-ec1a-4448-ade8-f58e81dc603d-kube-api-access-bn8qq\") pod \"73330567-ec1a-4448-ade8-f58e81dc603d\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " Apr 24 16:49:20.007899 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.007816 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/73330567-ec1a-4448-ade8-f58e81dc603d-console-oauth-config\") pod \"73330567-ec1a-4448-ade8-f58e81dc603d\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " Apr 24 16:49:20.007899 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.007875 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-trusted-ca-bundle\") pod \"73330567-ec1a-4448-ade8-f58e81dc603d\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " Apr 24 16:49:20.008161 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.007902 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/73330567-ec1a-4448-ade8-f58e81dc603d-console-serving-cert\") pod \"73330567-ec1a-4448-ade8-f58e81dc603d\" (UID: \"73330567-ec1a-4448-ade8-f58e81dc603d\") " Apr 24 16:49:20.008161 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.008122 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-service-ca" (OuterVolumeSpecName: "service-ca") pod "73330567-ec1a-4448-ade8-f58e81dc603d" (UID: "73330567-ec1a-4448-ade8-f58e81dc603d"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:49:20.008294 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.008182 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-console-config" (OuterVolumeSpecName: "console-config") pod "73330567-ec1a-4448-ade8-f58e81dc603d" (UID: "73330567-ec1a-4448-ade8-f58e81dc603d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:49:20.008434 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.008342 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "73330567-ec1a-4448-ade8-f58e81dc603d" (UID: "73330567-ec1a-4448-ade8-f58e81dc603d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:49:20.008434 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.008363 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "73330567-ec1a-4448-ade8-f58e81dc603d" (UID: "73330567-ec1a-4448-ade8-f58e81dc603d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:49:20.010130 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.010101 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73330567-ec1a-4448-ade8-f58e81dc603d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "73330567-ec1a-4448-ade8-f58e81dc603d" (UID: "73330567-ec1a-4448-ade8-f58e81dc603d"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:49:20.010235 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.010158 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73330567-ec1a-4448-ade8-f58e81dc603d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "73330567-ec1a-4448-ade8-f58e81dc603d" (UID: "73330567-ec1a-4448-ade8-f58e81dc603d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:49:20.010286 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.010263 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73330567-ec1a-4448-ade8-f58e81dc603d-kube-api-access-bn8qq" (OuterVolumeSpecName: "kube-api-access-bn8qq") pod "73330567-ec1a-4448-ade8-f58e81dc603d" (UID: "73330567-ec1a-4448-ade8-f58e81dc603d"). InnerVolumeSpecName "kube-api-access-bn8qq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:49:20.109367 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.109334 2563 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-oauth-serving-cert\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:49:20.109367 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.109360 2563 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-service-ca\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:49:20.109367 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.109371 2563 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-console-config\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:49:20.109565 ip-10-0-137-69 kubenswrapper[2563]: 
I0424 16:49:20.109380 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bn8qq\" (UniqueName: \"kubernetes.io/projected/73330567-ec1a-4448-ade8-f58e81dc603d-kube-api-access-bn8qq\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:49:20.109565 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.109389 2563 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/73330567-ec1a-4448-ade8-f58e81dc603d-console-oauth-config\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:49:20.109565 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.109398 2563 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73330567-ec1a-4448-ade8-f58e81dc603d-trusted-ca-bundle\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:49:20.109565 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.109406 2563 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/73330567-ec1a-4448-ade8-f58e81dc603d-console-serving-cert\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:49:20.599473 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.599392 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-ffbcc556b-96sbd_73330567-ec1a-4448-ade8-f58e81dc603d/console/0.log" Apr 24 16:49:20.599473 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.599438 2563 generic.go:358] "Generic (PLEG): container finished" podID="73330567-ec1a-4448-ade8-f58e81dc603d" containerID="31719586e890d2830cc2bf948062e0a1e8082300919712d8c3edd63b5a8d3858" exitCode=2 Apr 24 16:49:20.599704 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.599501 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-ffbcc556b-96sbd" Apr 24 16:49:20.599704 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.599517 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ffbcc556b-96sbd" event={"ID":"73330567-ec1a-4448-ade8-f58e81dc603d","Type":"ContainerDied","Data":"31719586e890d2830cc2bf948062e0a1e8082300919712d8c3edd63b5a8d3858"} Apr 24 16:49:20.599704 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.599550 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ffbcc556b-96sbd" event={"ID":"73330567-ec1a-4448-ade8-f58e81dc603d","Type":"ContainerDied","Data":"af7bd8b002029b9fe533a72965d7074afddeac93973beed815a99ea1c56b8329"} Apr 24 16:49:20.599704 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.599571 2563 scope.go:117] "RemoveContainer" containerID="31719586e890d2830cc2bf948062e0a1e8082300919712d8c3edd63b5a8d3858" Apr 24 16:49:20.609373 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.609357 2563 scope.go:117] "RemoveContainer" containerID="31719586e890d2830cc2bf948062e0a1e8082300919712d8c3edd63b5a8d3858" Apr 24 16:49:20.609669 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:49:20.609628 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31719586e890d2830cc2bf948062e0a1e8082300919712d8c3edd63b5a8d3858\": container with ID starting with 31719586e890d2830cc2bf948062e0a1e8082300919712d8c3edd63b5a8d3858 not found: ID does not exist" containerID="31719586e890d2830cc2bf948062e0a1e8082300919712d8c3edd63b5a8d3858" Apr 24 16:49:20.609669 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.609652 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31719586e890d2830cc2bf948062e0a1e8082300919712d8c3edd63b5a8d3858"} err="failed to get container status \"31719586e890d2830cc2bf948062e0a1e8082300919712d8c3edd63b5a8d3858\": rpc error: code = NotFound 
desc = could not find container \"31719586e890d2830cc2bf948062e0a1e8082300919712d8c3edd63b5a8d3858\": container with ID starting with 31719586e890d2830cc2bf948062e0a1e8082300919712d8c3edd63b5a8d3858 not found: ID does not exist" Apr 24 16:49:20.621662 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.621640 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-ffbcc556b-96sbd"] Apr 24 16:49:20.625802 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:20.625778 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-ffbcc556b-96sbd"] Apr 24 16:49:21.859924 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:21.859882 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73330567-ec1a-4448-ade8-f58e81dc603d" path="/var/lib/kubelet/pods/73330567-ec1a-4448-ade8-f58e81dc603d/volumes" Apr 24 16:49:34.936745 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:34.936712 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-llxx6"] Apr 24 16:49:34.937333 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:34.937128 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73330567-ec1a-4448-ade8-f58e81dc603d" containerName="console" Apr 24 16:49:34.937333 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:34.937165 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="73330567-ec1a-4448-ade8-f58e81dc603d" containerName="console" Apr 24 16:49:34.937333 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:34.937251 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="73330567-ec1a-4448-ade8-f58e81dc603d" containerName="console" Apr 24 16:49:34.940223 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:34.940203 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-llxx6" Apr 24 16:49:34.942806 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:34.942776 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-frjpv\"" Apr 24 16:49:34.942922 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:34.942819 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 24 16:49:34.948845 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:34.948825 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-llxx6"] Apr 24 16:49:35.014191 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:35.014130 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpg6q\" (UniqueName: \"kubernetes.io/projected/9e3ea280-7fc4-4adb-894d-b75ee6188aad-kube-api-access-xpg6q\") pod \"limitador-limitador-64c8f475fb-llxx6\" (UID: \"9e3ea280-7fc4-4adb-894d-b75ee6188aad\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-llxx6" Apr 24 16:49:35.014354 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:35.014221 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/9e3ea280-7fc4-4adb-894d-b75ee6188aad-config-file\") pod \"limitador-limitador-64c8f475fb-llxx6\" (UID: \"9e3ea280-7fc4-4adb-894d-b75ee6188aad\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-llxx6" Apr 24 16:49:35.036663 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:35.036631 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-llxx6"] Apr 24 16:49:35.115016 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:35.114987 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpg6q\" 
(UniqueName: \"kubernetes.io/projected/9e3ea280-7fc4-4adb-894d-b75ee6188aad-kube-api-access-xpg6q\") pod \"limitador-limitador-64c8f475fb-llxx6\" (UID: \"9e3ea280-7fc4-4adb-894d-b75ee6188aad\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-llxx6" Apr 24 16:49:35.115217 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:35.115040 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/9e3ea280-7fc4-4adb-894d-b75ee6188aad-config-file\") pod \"limitador-limitador-64c8f475fb-llxx6\" (UID: \"9e3ea280-7fc4-4adb-894d-b75ee6188aad\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-llxx6" Apr 24 16:49:35.115634 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:35.115613 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/9e3ea280-7fc4-4adb-894d-b75ee6188aad-config-file\") pod \"limitador-limitador-64c8f475fb-llxx6\" (UID: \"9e3ea280-7fc4-4adb-894d-b75ee6188aad\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-llxx6" Apr 24 16:49:35.133645 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:35.133620 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpg6q\" (UniqueName: \"kubernetes.io/projected/9e3ea280-7fc4-4adb-894d-b75ee6188aad-kube-api-access-xpg6q\") pod \"limitador-limitador-64c8f475fb-llxx6\" (UID: \"9e3ea280-7fc4-4adb-894d-b75ee6188aad\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-llxx6" Apr 24 16:49:35.250853 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:35.250822 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-llxx6" Apr 24 16:49:35.371651 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:35.371616 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-llxx6"] Apr 24 16:49:35.374679 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:49:35.374649 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e3ea280_7fc4_4adb_894d_b75ee6188aad.slice/crio-c9c1a3f52560bc779b1742f4f9d1f14e595f69a22526f637ccf96a8dbb6a6863 WatchSource:0}: Error finding container c9c1a3f52560bc779b1742f4f9d1f14e595f69a22526f637ccf96a8dbb6a6863: Status 404 returned error can't find the container with id c9c1a3f52560bc779b1742f4f9d1f14e595f69a22526f637ccf96a8dbb6a6863 Apr 24 16:49:35.647981 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:35.647904 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-llxx6" event={"ID":"9e3ea280-7fc4-4adb-894d-b75ee6188aad","Type":"ContainerStarted","Data":"c9c1a3f52560bc779b1742f4f9d1f14e595f69a22526f637ccf96a8dbb6a6863"} Apr 24 16:49:39.664246 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:39.664209 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-llxx6" event={"ID":"9e3ea280-7fc4-4adb-894d-b75ee6188aad","Type":"ContainerStarted","Data":"bdd2fdb449ec54a3e7fd86f9f1c7f95a8c6f30ee8af1863c6663ff3be925e931"} Apr 24 16:49:39.664595 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:39.664349 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-llxx6" Apr 24 16:49:39.682838 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:39.682791 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-llxx6" 
podStartSLOduration=1.6213794639999999 podStartE2EDuration="5.682772813s" podCreationTimestamp="2026-04-24 16:49:34 +0000 UTC" firstStartedPulling="2026-04-24 16:49:35.376473421 +0000 UTC m=+634.047104300" lastFinishedPulling="2026-04-24 16:49:39.43786676 +0000 UTC m=+638.108497649" observedRunningTime="2026-04-24 16:49:39.681905129 +0000 UTC m=+638.352536041" watchObservedRunningTime="2026-04-24 16:49:39.682772813 +0000 UTC m=+638.353403715" Apr 24 16:49:48.939089 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:48.939046 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-llxx6"] Apr 24 16:49:48.939675 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:48.939362 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-llxx6" podUID="9e3ea280-7fc4-4adb-894d-b75ee6188aad" containerName="limitador" containerID="cri-o://bdd2fdb449ec54a3e7fd86f9f1c7f95a8c6f30ee8af1863c6663ff3be925e931" gracePeriod=30 Apr 24 16:49:48.940116 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:48.939988 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-llxx6" Apr 24 16:49:49.470543 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:49.470521 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-llxx6" Apr 24 16:49:49.524150 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:49.524066 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/9e3ea280-7fc4-4adb-894d-b75ee6188aad-config-file\") pod \"9e3ea280-7fc4-4adb-894d-b75ee6188aad\" (UID: \"9e3ea280-7fc4-4adb-894d-b75ee6188aad\") " Apr 24 16:49:49.524150 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:49.524102 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpg6q\" (UniqueName: \"kubernetes.io/projected/9e3ea280-7fc4-4adb-894d-b75ee6188aad-kube-api-access-xpg6q\") pod \"9e3ea280-7fc4-4adb-894d-b75ee6188aad\" (UID: \"9e3ea280-7fc4-4adb-894d-b75ee6188aad\") " Apr 24 16:49:49.524500 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:49.524474 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e3ea280-7fc4-4adb-894d-b75ee6188aad-config-file" (OuterVolumeSpecName: "config-file") pod "9e3ea280-7fc4-4adb-894d-b75ee6188aad" (UID: "9e3ea280-7fc4-4adb-894d-b75ee6188aad"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:49:49.526255 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:49.526234 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e3ea280-7fc4-4adb-894d-b75ee6188aad-kube-api-access-xpg6q" (OuterVolumeSpecName: "kube-api-access-xpg6q") pod "9e3ea280-7fc4-4adb-894d-b75ee6188aad" (UID: "9e3ea280-7fc4-4adb-894d-b75ee6188aad"). InnerVolumeSpecName "kube-api-access-xpg6q". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:49:49.625290 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:49.625256 2563 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/9e3ea280-7fc4-4adb-894d-b75ee6188aad-config-file\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:49:49.625290 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:49.625285 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xpg6q\" (UniqueName: \"kubernetes.io/projected/9e3ea280-7fc4-4adb-894d-b75ee6188aad-kube-api-access-xpg6q\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:49:49.695497 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:49.695461 2563 generic.go:358] "Generic (PLEG): container finished" podID="9e3ea280-7fc4-4adb-894d-b75ee6188aad" containerID="bdd2fdb449ec54a3e7fd86f9f1c7f95a8c6f30ee8af1863c6663ff3be925e931" exitCode=0 Apr 24 16:49:49.695623 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:49.695503 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-llxx6" event={"ID":"9e3ea280-7fc4-4adb-894d-b75ee6188aad","Type":"ContainerDied","Data":"bdd2fdb449ec54a3e7fd86f9f1c7f95a8c6f30ee8af1863c6663ff3be925e931"} Apr 24 16:49:49.695623 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:49.695527 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-llxx6" Apr 24 16:49:49.695623 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:49.695543 2563 scope.go:117] "RemoveContainer" containerID="bdd2fdb449ec54a3e7fd86f9f1c7f95a8c6f30ee8af1863c6663ff3be925e931" Apr 24 16:49:49.695764 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:49.695529 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-llxx6" event={"ID":"9e3ea280-7fc4-4adb-894d-b75ee6188aad","Type":"ContainerDied","Data":"c9c1a3f52560bc779b1742f4f9d1f14e595f69a22526f637ccf96a8dbb6a6863"} Apr 24 16:49:49.704217 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:49.704196 2563 scope.go:117] "RemoveContainer" containerID="bdd2fdb449ec54a3e7fd86f9f1c7f95a8c6f30ee8af1863c6663ff3be925e931" Apr 24 16:49:49.704497 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:49:49.704472 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdd2fdb449ec54a3e7fd86f9f1c7f95a8c6f30ee8af1863c6663ff3be925e931\": container with ID starting with bdd2fdb449ec54a3e7fd86f9f1c7f95a8c6f30ee8af1863c6663ff3be925e931 not found: ID does not exist" containerID="bdd2fdb449ec54a3e7fd86f9f1c7f95a8c6f30ee8af1863c6663ff3be925e931" Apr 24 16:49:49.704561 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:49.704509 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdd2fdb449ec54a3e7fd86f9f1c7f95a8c6f30ee8af1863c6663ff3be925e931"} err="failed to get container status \"bdd2fdb449ec54a3e7fd86f9f1c7f95a8c6f30ee8af1863c6663ff3be925e931\": rpc error: code = NotFound desc = could not find container \"bdd2fdb449ec54a3e7fd86f9f1c7f95a8c6f30ee8af1863c6663ff3be925e931\": container with ID starting with bdd2fdb449ec54a3e7fd86f9f1c7f95a8c6f30ee8af1863c6663ff3be925e931 not found: ID does not exist" Apr 24 16:49:49.720617 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:49.720595 
2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-llxx6"] Apr 24 16:49:49.723648 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:49.723626 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-llxx6"] Apr 24 16:49:49.858746 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:49:49.858674 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e3ea280-7fc4-4adb-894d-b75ee6188aad" path="/var/lib/kubelet/pods/9e3ea280-7fc4-4adb-894d-b75ee6188aad/volumes" Apr 24 16:50:08.852619 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:08.852585 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd"] Apr 24 16:50:08.853082 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:08.852827 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e3ea280-7fc4-4adb-894d-b75ee6188aad" containerName="limitador" Apr 24 16:50:08.853082 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:08.852838 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3ea280-7fc4-4adb-894d-b75ee6188aad" containerName="limitador" Apr 24 16:50:08.853082 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:08.852890 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e3ea280-7fc4-4adb-894d-b75ee6188aad" containerName="limitador" Apr 24 16:50:08.856935 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:08.856916 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:08.868966 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:08.868944 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd"] Apr 24 16:50:08.959982 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:08.959946 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/96a57a22-0f7f-4408-bbaf-0671067e8c5d-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: \"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:08.959982 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:08.959986 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/96a57a22-0f7f-4408-bbaf-0671067e8c5d-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: \"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:08.960247 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:08.960008 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/96a57a22-0f7f-4408-bbaf-0671067e8c5d-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: \"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:08.960247 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:08.960050 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/96a57a22-0f7f-4408-bbaf-0671067e8c5d-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: 
\"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:08.960247 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:08.960099 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/96a57a22-0f7f-4408-bbaf-0671067e8c5d-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: \"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:08.960247 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:08.960205 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q75s2\" (UniqueName: \"kubernetes.io/projected/96a57a22-0f7f-4408-bbaf-0671067e8c5d-kube-api-access-q75s2\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: \"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:08.960247 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:08.960237 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/96a57a22-0f7f-4408-bbaf-0671067e8c5d-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: \"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:09.061275 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.061244 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/96a57a22-0f7f-4408-bbaf-0671067e8c5d-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: \"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:09.061435 ip-10-0-137-69 
kubenswrapper[2563]: I0424 16:50:09.061289 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/96a57a22-0f7f-4408-bbaf-0671067e8c5d-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: \"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:09.061435 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.061329 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q75s2\" (UniqueName: \"kubernetes.io/projected/96a57a22-0f7f-4408-bbaf-0671067e8c5d-kube-api-access-q75s2\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: \"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:09.061435 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.061347 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/96a57a22-0f7f-4408-bbaf-0671067e8c5d-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: \"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:09.061435 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.061375 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/96a57a22-0f7f-4408-bbaf-0671067e8c5d-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: \"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:09.061627 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.061459 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/96a57a22-0f7f-4408-bbaf-0671067e8c5d-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: \"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:09.061627 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.061506 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/96a57a22-0f7f-4408-bbaf-0671067e8c5d-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: \"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:09.062126 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.062100 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/96a57a22-0f7f-4408-bbaf-0671067e8c5d-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: \"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:09.063671 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.063645 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/96a57a22-0f7f-4408-bbaf-0671067e8c5d-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: \"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:09.063805 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.063787 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/96a57a22-0f7f-4408-bbaf-0671067e8c5d-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: \"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 
16:50:09.064070 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.064053 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/96a57a22-0f7f-4408-bbaf-0671067e8c5d-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: \"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:09.064197 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.064176 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/96a57a22-0f7f-4408-bbaf-0671067e8c5d-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: \"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:09.079924 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.079894 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/96a57a22-0f7f-4408-bbaf-0671067e8c5d-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: \"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:09.080022 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.080005 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q75s2\" (UniqueName: \"kubernetes.io/projected/96a57a22-0f7f-4408-bbaf-0671067e8c5d-kube-api-access-q75s2\") pod \"istiod-openshift-gateway-55ff986f96-8pfrd\" (UID: \"96a57a22-0f7f-4408-bbaf-0671067e8c5d\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:09.166149 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.166077 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:09.306233 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.306209 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd"] Apr 24 16:50:09.307586 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:50:09.307562 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a57a22_0f7f_4408_bbaf_0671067e8c5d.slice/crio-90cdc5741ab5e3695974909712f2bbf3682883ecc7b16a55cf5ca99c6cadc589 WatchSource:0}: Error finding container 90cdc5741ab5e3695974909712f2bbf3682883ecc7b16a55cf5ca99c6cadc589: Status 404 returned error can't find the container with id 90cdc5741ab5e3695974909712f2bbf3682883ecc7b16a55cf5ca99c6cadc589 Apr 24 16:50:09.309509 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.309476 2563 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"} Apr 24 16:50:09.309631 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.309555 2563 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"} Apr 24 16:50:09.757013 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.756978 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" event={"ID":"96a57a22-0f7f-4408-bbaf-0671067e8c5d","Type":"ContainerStarted","Data":"5cf1fec5b05d6060ae7277dbe9eb13481b9ca1603cd07d736d1b78604e7bca25"} Apr 24 16:50:09.757013 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.757015 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" 
event={"ID":"96a57a22-0f7f-4408-bbaf-0671067e8c5d","Type":"ContainerStarted","Data":"90cdc5741ab5e3695974909712f2bbf3682883ecc7b16a55cf5ca99c6cadc589"} Apr 24 16:50:09.757295 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.757040 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:09.799412 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:09.799368 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" podStartSLOduration=1.799353319 podStartE2EDuration="1.799353319s" podCreationTimestamp="2026-04-24 16:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:50:09.797763263 +0000 UTC m=+668.468394177" watchObservedRunningTime="2026-04-24 16:50:09.799353319 +0000 UTC m=+668.469984218" Apr 24 16:50:10.761672 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:10.761641 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8pfrd" Apr 24 16:50:10.842570 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:10.842533 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"] Apr 24 16:50:10.842817 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:10.842768 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds" podUID="d9bb6899-ba48-46d5-a4a6-5c43217f7328" containerName="discovery" containerID="cri-o://7741232101e80cdaffae641fcc8acb95738a1613e6e193bc7387ba176184ac89" gracePeriod=30 Apr 24 16:50:11.096377 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.096357 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds" Apr 24 16:50:11.180876 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.180845 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-489n2\" (UniqueName: \"kubernetes.io/projected/d9bb6899-ba48-46d5-a4a6-5c43217f7328-kube-api-access-489n2\") pod \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " Apr 24 16:50:11.181030 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.180896 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/d9bb6899-ba48-46d5-a4a6-5c43217f7328-local-certs\") pod \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " Apr 24 16:50:11.181030 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.180973 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-csr-dns-cert\") pod \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " Apr 24 16:50:11.181030 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.180998 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-csr-ca-configmap\") pod \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " Apr 24 16:50:11.181030 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.181023 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-token\") pod \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " Apr 24 16:50:11.181252 
ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.181053 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/d9bb6899-ba48-46d5-a4a6-5c43217f7328-cacerts\") pod \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " Apr 24 16:50:11.181252 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.181111 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-kubeconfig\") pod \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\" (UID: \"d9bb6899-ba48-46d5-a4a6-5c43217f7328\") " Apr 24 16:50:11.181852 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.181811 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "d9bb6899-ba48-46d5-a4a6-5c43217f7328" (UID: "d9bb6899-ba48-46d5-a4a6-5c43217f7328"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:50:11.183531 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.183486 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-token" (OuterVolumeSpecName: "istio-token") pod "d9bb6899-ba48-46d5-a4a6-5c43217f7328" (UID: "d9bb6899-ba48-46d5-a4a6-5c43217f7328"). InnerVolumeSpecName "istio-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:50:11.183722 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.183663 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9bb6899-ba48-46d5-a4a6-5c43217f7328-kube-api-access-489n2" (OuterVolumeSpecName: "kube-api-access-489n2") pod "d9bb6899-ba48-46d5-a4a6-5c43217f7328" (UID: "d9bb6899-ba48-46d5-a4a6-5c43217f7328"). InnerVolumeSpecName "kube-api-access-489n2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:50:11.183722 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.183695 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9bb6899-ba48-46d5-a4a6-5c43217f7328-local-certs" (OuterVolumeSpecName: "local-certs") pod "d9bb6899-ba48-46d5-a4a6-5c43217f7328" (UID: "d9bb6899-ba48-46d5-a4a6-5c43217f7328"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:50:11.183864 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.183717 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "d9bb6899-ba48-46d5-a4a6-5c43217f7328" (UID: "d9bb6899-ba48-46d5-a4a6-5c43217f7328"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:50:11.183864 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.183738 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "d9bb6899-ba48-46d5-a4a6-5c43217f7328" (UID: "d9bb6899-ba48-46d5-a4a6-5c43217f7328"). InnerVolumeSpecName "istio-csr-dns-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:50:11.183979 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.183963 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9bb6899-ba48-46d5-a4a6-5c43217f7328-cacerts" (OuterVolumeSpecName: "cacerts") pod "d9bb6899-ba48-46d5-a4a6-5c43217f7328" (UID: "d9bb6899-ba48-46d5-a4a6-5c43217f7328"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:50:11.281822 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.281754 2563 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-kubeconfig\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:50:11.281822 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.281789 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-489n2\" (UniqueName: \"kubernetes.io/projected/d9bb6899-ba48-46d5-a4a6-5c43217f7328-kube-api-access-489n2\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:50:11.281822 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.281800 2563 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/d9bb6899-ba48-46d5-a4a6-5c43217f7328-local-certs\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:50:11.281822 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.281809 2563 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-csr-dns-cert\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:50:11.281822 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.281818 2563 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: 
\"kubernetes.io/configmap/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-csr-ca-configmap\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:50:11.281822 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.281827 2563 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d9bb6899-ba48-46d5-a4a6-5c43217f7328-istio-token\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:50:11.282071 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.281836 2563 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/d9bb6899-ba48-46d5-a4a6-5c43217f7328-cacerts\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:50:11.764568 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.764529 2563 generic.go:358] "Generic (PLEG): container finished" podID="d9bb6899-ba48-46d5-a4a6-5c43217f7328" containerID="7741232101e80cdaffae641fcc8acb95738a1613e6e193bc7387ba176184ac89" exitCode=0 Apr 24 16:50:11.764951 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.764582 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds" Apr 24 16:50:11.764951 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.764602 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds" event={"ID":"d9bb6899-ba48-46d5-a4a6-5c43217f7328","Type":"ContainerDied","Data":"7741232101e80cdaffae641fcc8acb95738a1613e6e193bc7387ba176184ac89"} Apr 24 16:50:11.764951 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.764638 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds" event={"ID":"d9bb6899-ba48-46d5-a4a6-5c43217f7328","Type":"ContainerDied","Data":"68635c011b6e935348965e3ecab0f55c567dc9a9645586e2eb1ce2739cd0fd00"} Apr 24 16:50:11.764951 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.764654 2563 scope.go:117] "RemoveContainer" containerID="7741232101e80cdaffae641fcc8acb95738a1613e6e193bc7387ba176184ac89" Apr 24 16:50:11.772731 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.772670 2563 scope.go:117] "RemoveContainer" containerID="7741232101e80cdaffae641fcc8acb95738a1613e6e193bc7387ba176184ac89" Apr 24 16:50:11.772961 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:50:11.772937 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7741232101e80cdaffae641fcc8acb95738a1613e6e193bc7387ba176184ac89\": container with ID starting with 7741232101e80cdaffae641fcc8acb95738a1613e6e193bc7387ba176184ac89 not found: ID does not exist" containerID="7741232101e80cdaffae641fcc8acb95738a1613e6e193bc7387ba176184ac89" Apr 24 16:50:11.773046 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.772965 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7741232101e80cdaffae641fcc8acb95738a1613e6e193bc7387ba176184ac89"} err="failed to get container status 
\"7741232101e80cdaffae641fcc8acb95738a1613e6e193bc7387ba176184ac89\": rpc error: code = NotFound desc = could not find container \"7741232101e80cdaffae641fcc8acb95738a1613e6e193bc7387ba176184ac89\": container with ID starting with 7741232101e80cdaffae641fcc8acb95738a1613e6e193bc7387ba176184ac89 not found: ID does not exist" Apr 24 16:50:11.792164 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.792117 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"] Apr 24 16:50:11.800862 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.800837 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-8lmds"] Apr 24 16:50:11.861394 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:11.861364 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9bb6899-ba48-46d5-a4a6-5c43217f7328" path="/var/lib/kubelet/pods/d9bb6899-ba48-46d5-a4a6-5c43217f7328/volumes" Apr 24 16:50:19.339186 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.339154 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-n4ctx"] Apr 24 16:50:19.339624 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.339466 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9bb6899-ba48-46d5-a4a6-5c43217f7328" containerName="discovery" Apr 24 16:50:19.339624 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.339480 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9bb6899-ba48-46d5-a4a6-5c43217f7328" containerName="discovery" Apr 24 16:50:19.339624 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.339529 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9bb6899-ba48-46d5-a4a6-5c43217f7328" containerName="discovery" Apr 24 16:50:19.352658 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.352621 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve/kserve-controller-manager-7f7fb4c66f-n4ctx"] Apr 24 16:50:19.352792 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.352694 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7f7fb4c66f-n4ctx" Apr 24 16:50:19.355855 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.355829 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-qwfvj\"" Apr 24 16:50:19.355987 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.355958 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 24 16:50:19.356388 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.356370 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 16:50:19.356501 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.356372 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 16:50:19.362746 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.362726 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth"] Apr 24 16:50:19.364814 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.364794 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth" Apr 24 16:50:19.366861 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.366839 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-nrxvf\"" Apr 24 16:50:19.366967 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.366882 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 24 16:50:19.374755 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.374731 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth"] Apr 24 16:50:19.403872 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.403847 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-4267n"] Apr 24 16:50:19.405708 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.405694 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-4267n" Apr 24 16:50:19.409173 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.409153 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-7lswg\"" Apr 24 16:50:19.409368 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.409354 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 16:50:19.426174 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.426121 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-4267n"] Apr 24 16:50:19.448603 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.448579 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvhvq\" (UniqueName: \"kubernetes.io/projected/18abb458-fa75-4012-8d1c-81887d5cf548-kube-api-access-nvhvq\") pod \"kserve-controller-manager-7f7fb4c66f-n4ctx\" (UID: \"18abb458-fa75-4012-8d1c-81887d5cf548\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-n4ctx" Apr 24 16:50:19.448783 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.448764 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtv2k\" (UniqueName: \"kubernetes.io/projected/a8fffed1-71d1-4b6e-be08-88db6fedf6a7-kube-api-access-vtv2k\") pod \"llmisvc-controller-manager-5dcd86f4cc-8gjth\" (UID: \"a8fffed1-71d1-4b6e-be08-88db6fedf6a7\") " pod="kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth" Apr 24 16:50:19.448924 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.448907 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8fffed1-71d1-4b6e-be08-88db6fedf6a7-cert\") pod \"llmisvc-controller-manager-5dcd86f4cc-8gjth\" (UID: \"a8fffed1-71d1-4b6e-be08-88db6fedf6a7\") " 
pod="kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth" Apr 24 16:50:19.449003 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.448953 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18abb458-fa75-4012-8d1c-81887d5cf548-cert\") pod \"kserve-controller-manager-7f7fb4c66f-n4ctx\" (UID: \"18abb458-fa75-4012-8d1c-81887d5cf548\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-n4ctx" Apr 24 16:50:19.549876 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.549840 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvhvq\" (UniqueName: \"kubernetes.io/projected/18abb458-fa75-4012-8d1c-81887d5cf548-kube-api-access-nvhvq\") pod \"kserve-controller-manager-7f7fb4c66f-n4ctx\" (UID: \"18abb458-fa75-4012-8d1c-81887d5cf548\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-n4ctx" Apr 24 16:50:19.550035 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.549888 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-547xp\" (UniqueName: \"kubernetes.io/projected/b407162a-89ff-4f61-8939-6b050381662f-kube-api-access-547xp\") pod \"seaweedfs-86cc847c5c-4267n\" (UID: \"b407162a-89ff-4f61-8939-6b050381662f\") " pod="kserve/seaweedfs-86cc847c5c-4267n" Apr 24 16:50:19.550035 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.549920 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtv2k\" (UniqueName: \"kubernetes.io/projected/a8fffed1-71d1-4b6e-be08-88db6fedf6a7-kube-api-access-vtv2k\") pod \"llmisvc-controller-manager-5dcd86f4cc-8gjth\" (UID: \"a8fffed1-71d1-4b6e-be08-88db6fedf6a7\") " pod="kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth" Apr 24 16:50:19.550035 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.549943 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"data\" (UniqueName: \"kubernetes.io/empty-dir/b407162a-89ff-4f61-8939-6b050381662f-data\") pod \"seaweedfs-86cc847c5c-4267n\" (UID: \"b407162a-89ff-4f61-8939-6b050381662f\") " pod="kserve/seaweedfs-86cc847c5c-4267n" Apr 24 16:50:19.550035 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.549997 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8fffed1-71d1-4b6e-be08-88db6fedf6a7-cert\") pod \"llmisvc-controller-manager-5dcd86f4cc-8gjth\" (UID: \"a8fffed1-71d1-4b6e-be08-88db6fedf6a7\") " pod="kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth" Apr 24 16:50:19.550035 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.550035 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18abb458-fa75-4012-8d1c-81887d5cf548-cert\") pod \"kserve-controller-manager-7f7fb4c66f-n4ctx\" (UID: \"18abb458-fa75-4012-8d1c-81887d5cf548\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-n4ctx" Apr 24 16:50:19.550301 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:50:19.550075 2563 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 24 16:50:19.550301 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:50:19.550155 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8fffed1-71d1-4b6e-be08-88db6fedf6a7-cert podName:a8fffed1-71d1-4b6e-be08-88db6fedf6a7 nodeName:}" failed. No retries permitted until 2026-04-24 16:50:20.050120446 +0000 UTC m=+678.720751324 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a8fffed1-71d1-4b6e-be08-88db6fedf6a7-cert") pod "llmisvc-controller-manager-5dcd86f4cc-8gjth" (UID: "a8fffed1-71d1-4b6e-be08-88db6fedf6a7") : secret "llmisvc-webhook-server-cert" not found Apr 24 16:50:19.552423 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.552397 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18abb458-fa75-4012-8d1c-81887d5cf548-cert\") pod \"kserve-controller-manager-7f7fb4c66f-n4ctx\" (UID: \"18abb458-fa75-4012-8d1c-81887d5cf548\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-n4ctx" Apr 24 16:50:19.566314 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.566291 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvhvq\" (UniqueName: \"kubernetes.io/projected/18abb458-fa75-4012-8d1c-81887d5cf548-kube-api-access-nvhvq\") pod \"kserve-controller-manager-7f7fb4c66f-n4ctx\" (UID: \"18abb458-fa75-4012-8d1c-81887d5cf548\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-n4ctx" Apr 24 16:50:19.569816 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.569793 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtv2k\" (UniqueName: \"kubernetes.io/projected/a8fffed1-71d1-4b6e-be08-88db6fedf6a7-kube-api-access-vtv2k\") pod \"llmisvc-controller-manager-5dcd86f4cc-8gjth\" (UID: \"a8fffed1-71d1-4b6e-be08-88db6fedf6a7\") " pod="kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth" Apr 24 16:50:19.651026 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.650952 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-547xp\" (UniqueName: \"kubernetes.io/projected/b407162a-89ff-4f61-8939-6b050381662f-kube-api-access-547xp\") pod \"seaweedfs-86cc847c5c-4267n\" (UID: \"b407162a-89ff-4f61-8939-6b050381662f\") " pod="kserve/seaweedfs-86cc847c5c-4267n" Apr 24 16:50:19.651026 
ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.650987 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b407162a-89ff-4f61-8939-6b050381662f-data\") pod \"seaweedfs-86cc847c5c-4267n\" (UID: \"b407162a-89ff-4f61-8939-6b050381662f\") " pod="kserve/seaweedfs-86cc847c5c-4267n" Apr 24 16:50:19.651370 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.651354 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b407162a-89ff-4f61-8939-6b050381662f-data\") pod \"seaweedfs-86cc847c5c-4267n\" (UID: \"b407162a-89ff-4f61-8939-6b050381662f\") " pod="kserve/seaweedfs-86cc847c5c-4267n" Apr 24 16:50:19.660096 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.660075 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-547xp\" (UniqueName: \"kubernetes.io/projected/b407162a-89ff-4f61-8939-6b050381662f-kube-api-access-547xp\") pod \"seaweedfs-86cc847c5c-4267n\" (UID: \"b407162a-89ff-4f61-8939-6b050381662f\") " pod="kserve/seaweedfs-86cc847c5c-4267n" Apr 24 16:50:19.663894 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.663879 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7f7fb4c66f-n4ctx" Apr 24 16:50:19.714786 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.714742 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-4267n" Apr 24 16:50:19.791358 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.791260 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-n4ctx"] Apr 24 16:50:19.794614 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:50:19.794566 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18abb458_fa75_4012_8d1c_81887d5cf548.slice/crio-1d01deaad6d38e72a76fbeba3e4bc206888dd9c84f86f3eb707921152625fd65 WatchSource:0}: Error finding container 1d01deaad6d38e72a76fbeba3e4bc206888dd9c84f86f3eb707921152625fd65: Status 404 returned error can't find the container with id 1d01deaad6d38e72a76fbeba3e4bc206888dd9c84f86f3eb707921152625fd65 Apr 24 16:50:19.837728 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:19.837703 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-4267n"] Apr 24 16:50:19.841474 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:50:19.841448 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb407162a_89ff_4f61_8939_6b050381662f.slice/crio-90ecb52ba4092f799c03f6681557f8815213f6963d33f574351e6da2bbfc4d21 WatchSource:0}: Error finding container 90ecb52ba4092f799c03f6681557f8815213f6963d33f574351e6da2bbfc4d21: Status 404 returned error can't find the container with id 90ecb52ba4092f799c03f6681557f8815213f6963d33f574351e6da2bbfc4d21 Apr 24 16:50:20.054819 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:20.054785 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8fffed1-71d1-4b6e-be08-88db6fedf6a7-cert\") pod \"llmisvc-controller-manager-5dcd86f4cc-8gjth\" (UID: \"a8fffed1-71d1-4b6e-be08-88db6fedf6a7\") " pod="kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth" Apr 24 16:50:20.057392 ip-10-0-137-69 
kubenswrapper[2563]: I0424 16:50:20.057372 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8fffed1-71d1-4b6e-be08-88db6fedf6a7-cert\") pod \"llmisvc-controller-manager-5dcd86f4cc-8gjth\" (UID: \"a8fffed1-71d1-4b6e-be08-88db6fedf6a7\") " pod="kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth" Apr 24 16:50:20.274048 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:20.273978 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth" Apr 24 16:50:20.458915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:20.458887 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth"] Apr 24 16:50:20.462629 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:50:20.462419 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda8fffed1_71d1_4b6e_be08_88db6fedf6a7.slice/crio-769ca6e88bd01aef828472e419c34b36b4bac6c161c3a603a48ee54ee56cedf0 WatchSource:0}: Error finding container 769ca6e88bd01aef828472e419c34b36b4bac6c161c3a603a48ee54ee56cedf0: Status 404 returned error can't find the container with id 769ca6e88bd01aef828472e419c34b36b4bac6c161c3a603a48ee54ee56cedf0 Apr 24 16:50:20.464492 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:20.464395 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:50:20.796068 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:20.796024 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth" event={"ID":"a8fffed1-71d1-4b6e-be08-88db6fedf6a7","Type":"ContainerStarted","Data":"769ca6e88bd01aef828472e419c34b36b4bac6c161c3a603a48ee54ee56cedf0"} Apr 24 16:50:20.798047 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:20.797997 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-4267n" 
event={"ID":"b407162a-89ff-4f61-8939-6b050381662f","Type":"ContainerStarted","Data":"90ecb52ba4092f799c03f6681557f8815213f6963d33f574351e6da2bbfc4d21"} Apr 24 16:50:20.799541 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:20.799497 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f7fb4c66f-n4ctx" event={"ID":"18abb458-fa75-4012-8d1c-81887d5cf548","Type":"ContainerStarted","Data":"1d01deaad6d38e72a76fbeba3e4bc206888dd9c84f86f3eb707921152625fd65"} Apr 24 16:50:24.814618 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:24.814581 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-4267n" event={"ID":"b407162a-89ff-4f61-8939-6b050381662f","Type":"ContainerStarted","Data":"002b0fbcbaaa609e230b7f91d58271a338a5456dfe4822ead8cf789003828231"} Apr 24 16:50:24.815048 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:24.814643 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-4267n" Apr 24 16:50:24.815852 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:24.815829 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f7fb4c66f-n4ctx" event={"ID":"18abb458-fa75-4012-8d1c-81887d5cf548","Type":"ContainerStarted","Data":"0b23d4acf98aaf00e36ac9f9c9df5397b40b666feef0936ce5e8c66360aba89b"} Apr 24 16:50:24.815953 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:24.815931 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7f7fb4c66f-n4ctx" Apr 24 16:50:24.830356 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:24.830316 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-4267n" podStartSLOduration=1.759862872 podStartE2EDuration="5.830304065s" podCreationTimestamp="2026-04-24 16:50:19 +0000 UTC" firstStartedPulling="2026-04-24 16:50:19.842740076 +0000 UTC m=+678.513370955" 
lastFinishedPulling="2026-04-24 16:50:23.913181261 +0000 UTC m=+682.583812148" observedRunningTime="2026-04-24 16:50:24.82939632 +0000 UTC m=+683.500027220" watchObservedRunningTime="2026-04-24 16:50:24.830304065 +0000 UTC m=+683.500934993" Apr 24 16:50:25.821001 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:25.820964 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth" event={"ID":"a8fffed1-71d1-4b6e-be08-88db6fedf6a7","Type":"ContainerStarted","Data":"ef2d8c21159a6e62a23aa5c45f5442088e7fee55746ba8985e4caea54959a9f2"} Apr 24 16:50:25.821445 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:25.821126 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth" Apr 24 16:50:25.838474 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:25.838423 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7f7fb4c66f-n4ctx" podStartSLOduration=3.252825392 podStartE2EDuration="6.838410436s" podCreationTimestamp="2026-04-24 16:50:19 +0000 UTC" firstStartedPulling="2026-04-24 16:50:19.79590357 +0000 UTC m=+678.466534449" lastFinishedPulling="2026-04-24 16:50:23.381488608 +0000 UTC m=+682.052119493" observedRunningTime="2026-04-24 16:50:24.855378722 +0000 UTC m=+683.526009623" watchObservedRunningTime="2026-04-24 16:50:25.838410436 +0000 UTC m=+684.509041336" Apr 24 16:50:25.838642 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:25.838503 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth" podStartSLOduration=1.677048833 podStartE2EDuration="6.838499617s" podCreationTimestamp="2026-04-24 16:50:19 +0000 UTC" firstStartedPulling="2026-04-24 16:50:20.464570742 +0000 UTC m=+679.135201622" lastFinishedPulling="2026-04-24 16:50:25.626021519 +0000 UTC m=+684.296652406" observedRunningTime="2026-04-24 16:50:25.837276648 +0000 
UTC m=+684.507907549" watchObservedRunningTime="2026-04-24 16:50:25.838499617 +0000 UTC m=+684.509130518" Apr 24 16:50:30.823469 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:30.823434 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-4267n" Apr 24 16:50:55.825984 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:55.825896 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7f7fb4c66f-n4ctx" Apr 24 16:50:56.826284 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:56.826244 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth" Apr 24 16:50:58.124298 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.124261 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-n4ctx"] Apr 24 16:50:58.124682 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.124471 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-7f7fb4c66f-n4ctx" podUID="18abb458-fa75-4012-8d1c-81887d5cf548" containerName="manager" containerID="cri-o://0b23d4acf98aaf00e36ac9f9c9df5397b40b666feef0936ce5e8c66360aba89b" gracePeriod=10 Apr 24 16:50:58.155339 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.155314 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-9zdjk"] Apr 24 16:50:58.158542 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.158522 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7f7fb4c66f-9zdjk" Apr 24 16:50:58.166777 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.166757 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-9zdjk"] Apr 24 16:50:58.246020 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.245996 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e244dc29-8bf9-41c2-8af1-fd6e2a17fd35-cert\") pod \"kserve-controller-manager-7f7fb4c66f-9zdjk\" (UID: \"e244dc29-8bf9-41c2-8af1-fd6e2a17fd35\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-9zdjk" Apr 24 16:50:58.246172 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.246076 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzp4j\" (UniqueName: \"kubernetes.io/projected/e244dc29-8bf9-41c2-8af1-fd6e2a17fd35-kube-api-access-dzp4j\") pod \"kserve-controller-manager-7f7fb4c66f-9zdjk\" (UID: \"e244dc29-8bf9-41c2-8af1-fd6e2a17fd35\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-9zdjk" Apr 24 16:50:58.346852 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.346816 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e244dc29-8bf9-41c2-8af1-fd6e2a17fd35-cert\") pod \"kserve-controller-manager-7f7fb4c66f-9zdjk\" (UID: \"e244dc29-8bf9-41c2-8af1-fd6e2a17fd35\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-9zdjk" Apr 24 16:50:58.347016 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.346895 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzp4j\" (UniqueName: \"kubernetes.io/projected/e244dc29-8bf9-41c2-8af1-fd6e2a17fd35-kube-api-access-dzp4j\") pod \"kserve-controller-manager-7f7fb4c66f-9zdjk\" (UID: \"e244dc29-8bf9-41c2-8af1-fd6e2a17fd35\") " 
pod="kserve/kserve-controller-manager-7f7fb4c66f-9zdjk" Apr 24 16:50:58.349284 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.349263 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e244dc29-8bf9-41c2-8af1-fd6e2a17fd35-cert\") pod \"kserve-controller-manager-7f7fb4c66f-9zdjk\" (UID: \"e244dc29-8bf9-41c2-8af1-fd6e2a17fd35\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-9zdjk" Apr 24 16:50:58.355354 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.355333 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzp4j\" (UniqueName: \"kubernetes.io/projected/e244dc29-8bf9-41c2-8af1-fd6e2a17fd35-kube-api-access-dzp4j\") pod \"kserve-controller-manager-7f7fb4c66f-9zdjk\" (UID: \"e244dc29-8bf9-41c2-8af1-fd6e2a17fd35\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-9zdjk" Apr 24 16:50:58.365925 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.365908 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7f7fb4c66f-n4ctx" Apr 24 16:50:58.447297 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.447271 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18abb458-fa75-4012-8d1c-81887d5cf548-cert\") pod \"18abb458-fa75-4012-8d1c-81887d5cf548\" (UID: \"18abb458-fa75-4012-8d1c-81887d5cf548\") " Apr 24 16:50:58.447465 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.447309 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvhvq\" (UniqueName: \"kubernetes.io/projected/18abb458-fa75-4012-8d1c-81887d5cf548-kube-api-access-nvhvq\") pod \"18abb458-fa75-4012-8d1c-81887d5cf548\" (UID: \"18abb458-fa75-4012-8d1c-81887d5cf548\") " Apr 24 16:50:58.449462 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.449430 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18abb458-fa75-4012-8d1c-81887d5cf548-cert" (OuterVolumeSpecName: "cert") pod "18abb458-fa75-4012-8d1c-81887d5cf548" (UID: "18abb458-fa75-4012-8d1c-81887d5cf548"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:50:58.449575 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.449484 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18abb458-fa75-4012-8d1c-81887d5cf548-kube-api-access-nvhvq" (OuterVolumeSpecName: "kube-api-access-nvhvq") pod "18abb458-fa75-4012-8d1c-81887d5cf548" (UID: "18abb458-fa75-4012-8d1c-81887d5cf548"). InnerVolumeSpecName "kube-api-access-nvhvq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:50:58.507680 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.507647 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7f7fb4c66f-9zdjk" Apr 24 16:50:58.547749 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.547718 2563 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18abb458-fa75-4012-8d1c-81887d5cf548-cert\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:50:58.547749 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.547747 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nvhvq\" (UniqueName: \"kubernetes.io/projected/18abb458-fa75-4012-8d1c-81887d5cf548-kube-api-access-nvhvq\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:50:58.623397 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.623372 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-9zdjk"] Apr 24 16:50:58.626103 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:50:58.626074 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode244dc29_8bf9_41c2_8af1_fd6e2a17fd35.slice/crio-f8013154f8bebbb4eed3b382f4c59b45c92252b59ca380c1e99089041f88efa6 WatchSource:0}: Error finding container f8013154f8bebbb4eed3b382f4c59b45c92252b59ca380c1e99089041f88efa6: Status 404 returned error can't find the container with id f8013154f8bebbb4eed3b382f4c59b45c92252b59ca380c1e99089041f88efa6 Apr 24 16:50:58.923647 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.923559 2563 generic.go:358] "Generic (PLEG): container finished" podID="18abb458-fa75-4012-8d1c-81887d5cf548" containerID="0b23d4acf98aaf00e36ac9f9c9df5397b40b666feef0936ce5e8c66360aba89b" exitCode=0 Apr 24 16:50:58.923647 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.923625 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7f7fb4c66f-n4ctx" Apr 24 16:50:58.923848 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.923653 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f7fb4c66f-n4ctx" event={"ID":"18abb458-fa75-4012-8d1c-81887d5cf548","Type":"ContainerDied","Data":"0b23d4acf98aaf00e36ac9f9c9df5397b40b666feef0936ce5e8c66360aba89b"} Apr 24 16:50:58.923848 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.923686 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f7fb4c66f-n4ctx" event={"ID":"18abb458-fa75-4012-8d1c-81887d5cf548","Type":"ContainerDied","Data":"1d01deaad6d38e72a76fbeba3e4bc206888dd9c84f86f3eb707921152625fd65"} Apr 24 16:50:58.923848 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.923706 2563 scope.go:117] "RemoveContainer" containerID="0b23d4acf98aaf00e36ac9f9c9df5397b40b666feef0936ce5e8c66360aba89b" Apr 24 16:50:58.924754 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.924681 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f7fb4c66f-9zdjk" event={"ID":"e244dc29-8bf9-41c2-8af1-fd6e2a17fd35","Type":"ContainerStarted","Data":"f8013154f8bebbb4eed3b382f4c59b45c92252b59ca380c1e99089041f88efa6"} Apr 24 16:50:58.932110 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.932083 2563 scope.go:117] "RemoveContainer" containerID="0b23d4acf98aaf00e36ac9f9c9df5397b40b666feef0936ce5e8c66360aba89b" Apr 24 16:50:58.932446 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:50:58.932379 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b23d4acf98aaf00e36ac9f9c9df5397b40b666feef0936ce5e8c66360aba89b\": container with ID starting with 0b23d4acf98aaf00e36ac9f9c9df5397b40b666feef0936ce5e8c66360aba89b not found: ID does not exist" containerID="0b23d4acf98aaf00e36ac9f9c9df5397b40b666feef0936ce5e8c66360aba89b" Apr 24 
16:50:58.932446 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.932416 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b23d4acf98aaf00e36ac9f9c9df5397b40b666feef0936ce5e8c66360aba89b"} err="failed to get container status \"0b23d4acf98aaf00e36ac9f9c9df5397b40b666feef0936ce5e8c66360aba89b\": rpc error: code = NotFound desc = could not find container \"0b23d4acf98aaf00e36ac9f9c9df5397b40b666feef0936ce5e8c66360aba89b\": container with ID starting with 0b23d4acf98aaf00e36ac9f9c9df5397b40b666feef0936ce5e8c66360aba89b not found: ID does not exist" Apr 24 16:50:58.945702 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.945681 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-n4ctx"] Apr 24 16:50:58.948508 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:58.948489 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-n4ctx"] Apr 24 16:50:59.859343 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:59.859309 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18abb458-fa75-4012-8d1c-81887d5cf548" path="/var/lib/kubelet/pods/18abb458-fa75-4012-8d1c-81887d5cf548/volumes" Apr 24 16:50:59.930003 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:59.929972 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f7fb4c66f-9zdjk" event={"ID":"e244dc29-8bf9-41c2-8af1-fd6e2a17fd35","Type":"ContainerStarted","Data":"827379f341082b77758b6e725cd7c3f020de915c637a340fb7fc5e76f35308c3"} Apr 24 16:50:59.930170 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:59.930086 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7f7fb4c66f-9zdjk" Apr 24 16:50:59.947101 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:50:59.947060 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/kserve-controller-manager-7f7fb4c66f-9zdjk" podStartSLOduration=1.5166792930000002 podStartE2EDuration="1.947045964s" podCreationTimestamp="2026-04-24 16:50:58 +0000 UTC" firstStartedPulling="2026-04-24 16:50:58.627402913 +0000 UTC m=+717.298033791" lastFinishedPulling="2026-04-24 16:50:59.05776958 +0000 UTC m=+717.728400462" observedRunningTime="2026-04-24 16:50:59.94628871 +0000 UTC m=+718.616919611" watchObservedRunningTime="2026-04-24 16:50:59.947045964 +0000 UTC m=+718.617676864" Apr 24 16:51:30.939350 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:30.939319 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7f7fb4c66f-9zdjk" Apr 24 16:51:31.884551 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:31.884517 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-g9g52"] Apr 24 16:51:31.884787 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:31.884775 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18abb458-fa75-4012-8d1c-81887d5cf548" containerName="manager" Apr 24 16:51:31.884829 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:31.884789 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="18abb458-fa75-4012-8d1c-81887d5cf548" containerName="manager" Apr 24 16:51:31.884868 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:31.884834 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="18abb458-fa75-4012-8d1c-81887d5cf548" containerName="manager" Apr 24 16:51:31.887933 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:31.887913 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-g9g52" Apr 24 16:51:31.890384 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:31.890364 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 24 16:51:31.892521 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:31.892503 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-m2vpj\"" Apr 24 16:51:31.899707 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:31.899687 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-g9g52"] Apr 24 16:51:31.992596 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:31.992566 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ml64\" (UniqueName: \"kubernetes.io/projected/c25daac5-251d-47fa-a76a-9a28fd34cfa9-kube-api-access-6ml64\") pod \"model-serving-api-86f7b4b499-g9g52\" (UID: \"c25daac5-251d-47fa-a76a-9a28fd34cfa9\") " pod="kserve/model-serving-api-86f7b4b499-g9g52" Apr 24 16:51:31.992925 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:31.992606 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c25daac5-251d-47fa-a76a-9a28fd34cfa9-tls-certs\") pod \"model-serving-api-86f7b4b499-g9g52\" (UID: \"c25daac5-251d-47fa-a76a-9a28fd34cfa9\") " pod="kserve/model-serving-api-86f7b4b499-g9g52" Apr 24 16:51:32.093872 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:32.093841 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ml64\" (UniqueName: \"kubernetes.io/projected/c25daac5-251d-47fa-a76a-9a28fd34cfa9-kube-api-access-6ml64\") pod \"model-serving-api-86f7b4b499-g9g52\" (UID: \"c25daac5-251d-47fa-a76a-9a28fd34cfa9\") " pod="kserve/model-serving-api-86f7b4b499-g9g52" Apr 24 16:51:32.094026 
ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:32.093887 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c25daac5-251d-47fa-a76a-9a28fd34cfa9-tls-certs\") pod \"model-serving-api-86f7b4b499-g9g52\" (UID: \"c25daac5-251d-47fa-a76a-9a28fd34cfa9\") " pod="kserve/model-serving-api-86f7b4b499-g9g52"
Apr 24 16:51:32.096192 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:32.096168 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c25daac5-251d-47fa-a76a-9a28fd34cfa9-tls-certs\") pod \"model-serving-api-86f7b4b499-g9g52\" (UID: \"c25daac5-251d-47fa-a76a-9a28fd34cfa9\") " pod="kserve/model-serving-api-86f7b4b499-g9g52"
Apr 24 16:51:32.109814 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:32.109792 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ml64\" (UniqueName: \"kubernetes.io/projected/c25daac5-251d-47fa-a76a-9a28fd34cfa9-kube-api-access-6ml64\") pod \"model-serving-api-86f7b4b499-g9g52\" (UID: \"c25daac5-251d-47fa-a76a-9a28fd34cfa9\") " pod="kserve/model-serving-api-86f7b4b499-g9g52"
Apr 24 16:51:32.198716 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:32.198694 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-g9g52"
Apr 24 16:51:32.321928 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:32.321900 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-g9g52"]
Apr 24 16:51:32.324004 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:51:32.323972 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc25daac5_251d_47fa_a76a_9a28fd34cfa9.slice/crio-13a09de8f64a46b6f7ca76cfe4eb234a7c3ab43a8340c180430675fcd65339fe WatchSource:0}: Error finding container 13a09de8f64a46b6f7ca76cfe4eb234a7c3ab43a8340c180430675fcd65339fe: Status 404 returned error can't find the container with id 13a09de8f64a46b6f7ca76cfe4eb234a7c3ab43a8340c180430675fcd65339fe
Apr 24 16:51:33.028817 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:33.028782 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-g9g52" event={"ID":"c25daac5-251d-47fa-a76a-9a28fd34cfa9","Type":"ContainerStarted","Data":"13a09de8f64a46b6f7ca76cfe4eb234a7c3ab43a8340c180430675fcd65339fe"}
Apr 24 16:51:34.033356 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:34.033311 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-g9g52" event={"ID":"c25daac5-251d-47fa-a76a-9a28fd34cfa9","Type":"ContainerStarted","Data":"952181e25695fa776d2b962061a320a74b4b99eb939e0d11601c75ddaad9686d"}
Apr 24 16:51:34.033730 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:34.033496 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-g9g52"
Apr 24 16:51:34.050717 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:34.050677 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-g9g52" podStartSLOduration=1.8204544299999998 podStartE2EDuration="3.050666531s" podCreationTimestamp="2026-04-24 16:51:31 +0000 UTC" firstStartedPulling="2026-04-24 16:51:32.326234298 +0000 UTC m=+750.996865176" lastFinishedPulling="2026-04-24 16:51:33.556446396 +0000 UTC m=+752.227077277" observedRunningTime="2026-04-24 16:51:34.049676826 +0000 UTC m=+752.720307726" watchObservedRunningTime="2026-04-24 16:51:34.050666531 +0000 UTC m=+752.721297431"
Apr 24 16:51:45.040970 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:51:45.040942 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-g9g52"
Apr 24 16:52:12.975697 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:12.975661 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"]
Apr 24 16:52:12.978148 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:12.978111 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:12.981567 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:12.981387 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 24 16:52:12.982118 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:12.982094 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-5h4q4\""
Apr 24 16:52:12.982258 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:12.982168 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 24 16:52:12.983740 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:12.983073 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q89jh\""
Apr 24 16:52:12.983994 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:12.983971 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\""
Apr 24 16:52:12.993794 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:12.993776 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"]
Apr 24 16:52:13.100578 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:13.100540 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2rq6\" (UniqueName: \"kubernetes.io/projected/967142fa-69f6-4781-b3a3-f8b554af93ef-kube-api-access-k2rq6\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:13.100578 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:13.100582 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:13.100815 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:13.100614 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:13.100815 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:13.100725 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:13.100815 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:13.100775 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/967142fa-69f6-4781-b3a3-f8b554af93ef-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:13.100937 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:13.100817 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:13.201393 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:13.201357 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2rq6\" (UniqueName: \"kubernetes.io/projected/967142fa-69f6-4781-b3a3-f8b554af93ef-kube-api-access-k2rq6\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:13.201393 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:13.201395 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:13.201648 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:13.201416 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:13.201648 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:13.201451 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:13.201648 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:13.201477 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/967142fa-69f6-4781-b3a3-f8b554af93ef-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:13.201648 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:13.201501 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:13.201879 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:13.201853 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:13.201966 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:13.201878 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:13.201966 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:13.201945 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:13.202058 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:13.201957 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:13.203937 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:13.203920 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/967142fa-69f6-4781-b3a3-f8b554af93ef-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:13.211975 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:13.211951 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2rq6\" (UniqueName: \"kubernetes.io/projected/967142fa-69f6-4781-b3a3-f8b554af93ef-kube-api-access-k2rq6\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:13.289965 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:13.289896 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:13.413564 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:13.413543 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"]
Apr 24 16:52:13.415738 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:52:13.415710 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod967142fa_69f6_4781_b3a3_f8b554af93ef.slice/crio-891999b3bcb5eabc8bb4ff8ef52a9556b45338ad3780a7e5b9f8cd82bff21381 WatchSource:0}: Error finding container 891999b3bcb5eabc8bb4ff8ef52a9556b45338ad3780a7e5b9f8cd82bff21381: Status 404 returned error can't find the container with id 891999b3bcb5eabc8bb4ff8ef52a9556b45338ad3780a7e5b9f8cd82bff21381
Apr 24 16:52:14.154310 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:14.154274 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq" event={"ID":"967142fa-69f6-4781-b3a3-f8b554af93ef","Type":"ContainerStarted","Data":"891999b3bcb5eabc8bb4ff8ef52a9556b45338ad3780a7e5b9f8cd82bff21381"}
Apr 24 16:52:17.166451 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:17.166417 2563 generic.go:358] "Generic (PLEG): container finished" podID="967142fa-69f6-4781-b3a3-f8b554af93ef" containerID="9e9352cf9ed04bcc7df0fb5dfd1862ba2b343efb5f95998f5435c101ab71c2df" exitCode=0
Apr 24 16:52:17.166805 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:17.166498 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq" event={"ID":"967142fa-69f6-4781-b3a3-f8b554af93ef","Type":"ContainerDied","Data":"9e9352cf9ed04bcc7df0fb5dfd1862ba2b343efb5f95998f5435c101ab71c2df"}
Apr 24 16:52:19.174969 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:19.174928 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq" event={"ID":"967142fa-69f6-4781-b3a3-f8b554af93ef","Type":"ContainerStarted","Data":"623b08545873a26f2d7d8a200aee8da9e8c5f123ad459fb10abb092331b65413"}
Apr 24 16:52:49.288711 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:49.288675 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq" event={"ID":"967142fa-69f6-4781-b3a3-f8b554af93ef","Type":"ContainerStarted","Data":"59a1a629373e664e465ffceb53aa3758ecf16a4890642abb1a8a0e9952a76569"}
Apr 24 16:52:49.289245 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:49.288912 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:49.291615 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:49.291592 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:49.310231 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:49.310189 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq" podStartSLOduration=2.433094527 podStartE2EDuration="37.310175113s" podCreationTimestamp="2026-04-24 16:52:12 +0000 UTC" firstStartedPulling="2026-04-24 16:52:13.417561018 +0000 UTC m=+792.088191906" lastFinishedPulling="2026-04-24 16:52:48.294641613 +0000 UTC m=+826.965272492" observedRunningTime="2026-04-24 16:52:49.308398797 +0000 UTC m=+827.979029700" watchObservedRunningTime="2026-04-24 16:52:49.310175113 +0000 UTC m=+827.980806018"
Apr 24 16:52:53.290369 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:53.290326 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:53.290795 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:53.290384 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:52:53.290795 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:52:53.290740 2563 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq" podUID="967142fa-69f6-4781-b3a3-f8b554af93ef" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.29:8082/healthz\": dial tcp 10.132.0.29:8082: connect: connection refused"
Apr 24 16:53:03.292551 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:53:03.292519 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:53:03.293937 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:53:03.293911 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:55:15.980758 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:15.980722 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"]
Apr 24 16:55:15.984917 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:15.984892 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:15.987094 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:15.987071 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-p774q\""
Apr 24 16:55:15.987459 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:15.987439 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\""
Apr 24 16:55:15.997571 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:15.997550 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"]
Apr 24 16:55:16.006777 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.006749 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:16.006885 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.006811 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92lm2\" (UniqueName: \"kubernetes.io/projected/e45408af-70a4-4b6e-9654-f9a854de4e08-kube-api-access-92lm2\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:16.006951 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.006891 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:16.007001 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.006953 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e45408af-70a4-4b6e-9654-f9a854de4e08-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:16.007001 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.006991 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:16.007085 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.007013 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:16.108175 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.108116 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92lm2\" (UniqueName: \"kubernetes.io/projected/e45408af-70a4-4b6e-9654-f9a854de4e08-kube-api-access-92lm2\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:16.108175 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.108176 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:16.108400 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.108198 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e45408af-70a4-4b6e-9654-f9a854de4e08-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:16.108400 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.108216 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:16.108400 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.108234 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:16.108400 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.108292 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:16.108625 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.108574 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:16.108683 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.108636 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:16.108717 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.108691 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:16.108750 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.108718 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:16.110585 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.110564 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e45408af-70a4-4b6e-9654-f9a854de4e08-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:16.116417 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.116396 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92lm2\" (UniqueName: \"kubernetes.io/projected/e45408af-70a4-4b6e-9654-f9a854de4e08-kube-api-access-92lm2\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:16.294289 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.294215 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:16.419842 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.419791 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"]
Apr 24 16:55:16.422228 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:55:16.422202 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode45408af_70a4_4b6e_9654_f9a854de4e08.slice/crio-37827e4d530366253cb2416831bce5c16c454aa020fdbe4962b4e1e0d04c0c50 WatchSource:0}: Error finding container 37827e4d530366253cb2416831bce5c16c454aa020fdbe4962b4e1e0d04c0c50: Status 404 returned error can't find the container with id 37827e4d530366253cb2416831bce5c16c454aa020fdbe4962b4e1e0d04c0c50
Apr 24 16:55:16.758530 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.758497 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w" event={"ID":"e45408af-70a4-4b6e-9654-f9a854de4e08","Type":"ContainerStarted","Data":"f0fbb2cd306c75dd41c5edcf08389b95137e7f0c35214349607a03f883d52b4d"}
Apr 24 16:55:16.758530 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:16.758537 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w" event={"ID":"e45408af-70a4-4b6e-9654-f9a854de4e08","Type":"ContainerStarted","Data":"37827e4d530366253cb2416831bce5c16c454aa020fdbe4962b4e1e0d04c0c50"}
Apr 24 16:55:17.764362 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:17.764321 2563 generic.go:358] "Generic (PLEG): container finished" podID="e45408af-70a4-4b6e-9654-f9a854de4e08" containerID="f0fbb2cd306c75dd41c5edcf08389b95137e7f0c35214349607a03f883d52b4d" exitCode=0
Apr 24 16:55:17.764362 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:17.764361 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w" event={"ID":"e45408af-70a4-4b6e-9654-f9a854de4e08","Type":"ContainerDied","Data":"f0fbb2cd306c75dd41c5edcf08389b95137e7f0c35214349607a03f883d52b4d"}
Apr 24 16:55:18.770604 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:18.770569 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w" event={"ID":"e45408af-70a4-4b6e-9654-f9a854de4e08","Type":"ContainerStarted","Data":"69d5ceee1fa5bc220b411a1195869c49fa6bdcdf4b5dd99727be6b8622d00205"}
Apr 24 16:55:18.770604 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:18.770607 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w" event={"ID":"e45408af-70a4-4b6e-9654-f9a854de4e08","Type":"ContainerStarted","Data":"39e1af496afe3c8e7ed7ca95fbb8196004a29476c8c1c4b88991d5bf095572c5"}
Apr 24 16:55:18.771007 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:18.770773 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:18.792876 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:18.792825 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w" podStartSLOduration=3.792809582 podStartE2EDuration="3.792809582s" podCreationTimestamp="2026-04-24 16:55:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:55:18.79178013 +0000 UTC m=+977.462411032" watchObservedRunningTime="2026-04-24 16:55:18.792809582 +0000 UTC m=+977.463440485"
Apr 24 16:55:26.294837 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:26.294737 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:26.295214 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:26.294898 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:26.297395 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:26.297371 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:26.800695 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:26.800665 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:55:48.806785 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:55:48.806755 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"
Apr 24 16:56:14.674290 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:14.674249 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"]
Apr 24 16:56:14.674786 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:14.674709 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq" podUID="967142fa-69f6-4781-b3a3-f8b554af93ef" containerName="main" containerID="cri-o://623b08545873a26f2d7d8a200aee8da9e8c5f123ad459fb10abb092331b65413" gracePeriod=30
Apr 24 16:56:14.674854 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:14.674764 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq" podUID="967142fa-69f6-4781-b3a3-f8b554af93ef" containerName="tokenizer" containerID="cri-o://59a1a629373e664e465ffceb53aa3758ecf16a4890642abb1a8a0e9952a76569" gracePeriod=30
Apr 24 16:56:14.953728 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:14.953691 2563 generic.go:358] "Generic (PLEG): container finished" podID="967142fa-69f6-4781-b3a3-f8b554af93ef" containerID="623b08545873a26f2d7d8a200aee8da9e8c5f123ad459fb10abb092331b65413" exitCode=0
Apr 24 16:56:14.953891 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:14.953769 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq" event={"ID":"967142fa-69f6-4781-b3a3-f8b554af93ef","Type":"ContainerDied","Data":"623b08545873a26f2d7d8a200aee8da9e8c5f123ad459fb10abb092331b65413"}
Apr 24 16:56:15.817919 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.817900 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:56:15.958078 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.958048 2563 generic.go:358] "Generic (PLEG): container finished" podID="967142fa-69f6-4781-b3a3-f8b554af93ef" containerID="59a1a629373e664e465ffceb53aa3758ecf16a4890642abb1a8a0e9952a76569" exitCode=0
Apr 24 16:56:15.958230 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.958121 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"
Apr 24 16:56:15.958230 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.958152 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq" event={"ID":"967142fa-69f6-4781-b3a3-f8b554af93ef","Type":"ContainerDied","Data":"59a1a629373e664e465ffceb53aa3758ecf16a4890642abb1a8a0e9952a76569"}
Apr 24 16:56:15.958230 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.958178 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq" event={"ID":"967142fa-69f6-4781-b3a3-f8b554af93ef","Type":"ContainerDied","Data":"891999b3bcb5eabc8bb4ff8ef52a9556b45338ad3780a7e5b9f8cd82bff21381"}
Apr 24 16:56:15.958230 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.958197 2563 scope.go:117] "RemoveContainer" containerID="59a1a629373e664e465ffceb53aa3758ecf16a4890642abb1a8a0e9952a76569"
Apr 24 16:56:15.965691 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.965662 2563 scope.go:117] "RemoveContainer" containerID="623b08545873a26f2d7d8a200aee8da9e8c5f123ad459fb10abb092331b65413"
Apr 24 16:56:15.972435 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.972420 2563 scope.go:117] "RemoveContainer" containerID="9e9352cf9ed04bcc7df0fb5dfd1862ba2b343efb5f95998f5435c101ab71c2df"
Apr 24 16:56:15.974320 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.974302 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-tokenizer-cache\") pod \"967142fa-69f6-4781-b3a3-f8b554af93ef\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") "
Apr 24 16:56:15.974399 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.974376 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume
\"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-tokenizer-tmp\") pod \"967142fa-69f6-4781-b3a3-f8b554af93ef\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " Apr 24 16:56:15.974449 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.974414 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2rq6\" (UniqueName: \"kubernetes.io/projected/967142fa-69f6-4781-b3a3-f8b554af93ef-kube-api-access-k2rq6\") pod \"967142fa-69f6-4781-b3a3-f8b554af93ef\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " Apr 24 16:56:15.974449 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.974442 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/967142fa-69f6-4781-b3a3-f8b554af93ef-tls-certs\") pod \"967142fa-69f6-4781-b3a3-f8b554af93ef\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " Apr 24 16:56:15.974522 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.974485 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-tokenizer-uds\") pod \"967142fa-69f6-4781-b3a3-f8b554af93ef\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " Apr 24 16:56:15.974560 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.974527 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "967142fa-69f6-4781-b3a3-f8b554af93ef" (UID: "967142fa-69f6-4781-b3a3-f8b554af93ef"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:56:15.974657 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.974638 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-kserve-provision-location\") pod \"967142fa-69f6-4781-b3a3-f8b554af93ef\" (UID: \"967142fa-69f6-4781-b3a3-f8b554af93ef\") " Apr 24 16:56:15.974725 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.974711 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "967142fa-69f6-4781-b3a3-f8b554af93ef" (UID: "967142fa-69f6-4781-b3a3-f8b554af93ef"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:56:15.974776 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.974736 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "967142fa-69f6-4781-b3a3-f8b554af93ef" (UID: "967142fa-69f6-4781-b3a3-f8b554af93ef"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:56:15.974981 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.974962 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-tokenizer-tmp\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:56:15.975060 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.974986 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-tokenizer-uds\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:56:15.975060 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.974999 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-tokenizer-cache\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:56:15.975468 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.975442 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "967142fa-69f6-4781-b3a3-f8b554af93ef" (UID: "967142fa-69f6-4781-b3a3-f8b554af93ef"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:56:15.976683 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.976652 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/967142fa-69f6-4781-b3a3-f8b554af93ef-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "967142fa-69f6-4781-b3a3-f8b554af93ef" (UID: "967142fa-69f6-4781-b3a3-f8b554af93ef"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:56:15.976683 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.976669 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/967142fa-69f6-4781-b3a3-f8b554af93ef-kube-api-access-k2rq6" (OuterVolumeSpecName: "kube-api-access-k2rq6") pod "967142fa-69f6-4781-b3a3-f8b554af93ef" (UID: "967142fa-69f6-4781-b3a3-f8b554af93ef"). InnerVolumeSpecName "kube-api-access-k2rq6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:56:15.979581 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.979566 2563 scope.go:117] "RemoveContainer" containerID="59a1a629373e664e465ffceb53aa3758ecf16a4890642abb1a8a0e9952a76569" Apr 24 16:56:15.979830 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:56:15.979813 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59a1a629373e664e465ffceb53aa3758ecf16a4890642abb1a8a0e9952a76569\": container with ID starting with 59a1a629373e664e465ffceb53aa3758ecf16a4890642abb1a8a0e9952a76569 not found: ID does not exist" containerID="59a1a629373e664e465ffceb53aa3758ecf16a4890642abb1a8a0e9952a76569" Apr 24 16:56:15.979886 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.979835 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59a1a629373e664e465ffceb53aa3758ecf16a4890642abb1a8a0e9952a76569"} err="failed to get container status \"59a1a629373e664e465ffceb53aa3758ecf16a4890642abb1a8a0e9952a76569\": rpc error: code = NotFound desc = could not find container \"59a1a629373e664e465ffceb53aa3758ecf16a4890642abb1a8a0e9952a76569\": container with ID starting with 59a1a629373e664e465ffceb53aa3758ecf16a4890642abb1a8a0e9952a76569 not found: ID does not exist" Apr 24 16:56:15.979886 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.979852 2563 scope.go:117] "RemoveContainer" 
containerID="623b08545873a26f2d7d8a200aee8da9e8c5f123ad459fb10abb092331b65413" Apr 24 16:56:15.980063 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:56:15.980049 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"623b08545873a26f2d7d8a200aee8da9e8c5f123ad459fb10abb092331b65413\": container with ID starting with 623b08545873a26f2d7d8a200aee8da9e8c5f123ad459fb10abb092331b65413 not found: ID does not exist" containerID="623b08545873a26f2d7d8a200aee8da9e8c5f123ad459fb10abb092331b65413" Apr 24 16:56:15.980106 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.980066 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"623b08545873a26f2d7d8a200aee8da9e8c5f123ad459fb10abb092331b65413"} err="failed to get container status \"623b08545873a26f2d7d8a200aee8da9e8c5f123ad459fb10abb092331b65413\": rpc error: code = NotFound desc = could not find container \"623b08545873a26f2d7d8a200aee8da9e8c5f123ad459fb10abb092331b65413\": container with ID starting with 623b08545873a26f2d7d8a200aee8da9e8c5f123ad459fb10abb092331b65413 not found: ID does not exist" Apr 24 16:56:15.980106 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.980078 2563 scope.go:117] "RemoveContainer" containerID="9e9352cf9ed04bcc7df0fb5dfd1862ba2b343efb5f95998f5435c101ab71c2df" Apr 24 16:56:15.980306 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:56:15.980290 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e9352cf9ed04bcc7df0fb5dfd1862ba2b343efb5f95998f5435c101ab71c2df\": container with ID starting with 9e9352cf9ed04bcc7df0fb5dfd1862ba2b343efb5f95998f5435c101ab71c2df not found: ID does not exist" containerID="9e9352cf9ed04bcc7df0fb5dfd1862ba2b343efb5f95998f5435c101ab71c2df" Apr 24 16:56:15.980346 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:15.980310 2563 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"9e9352cf9ed04bcc7df0fb5dfd1862ba2b343efb5f95998f5435c101ab71c2df"} err="failed to get container status \"9e9352cf9ed04bcc7df0fb5dfd1862ba2b343efb5f95998f5435c101ab71c2df\": rpc error: code = NotFound desc = could not find container \"9e9352cf9ed04bcc7df0fb5dfd1862ba2b343efb5f95998f5435c101ab71c2df\": container with ID starting with 9e9352cf9ed04bcc7df0fb5dfd1862ba2b343efb5f95998f5435c101ab71c2df not found: ID does not exist" Apr 24 16:56:16.075463 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:16.075438 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k2rq6\" (UniqueName: \"kubernetes.io/projected/967142fa-69f6-4781-b3a3-f8b554af93ef-kube-api-access-k2rq6\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:56:16.075463 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:16.075462 2563 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/967142fa-69f6-4781-b3a3-f8b554af93ef-tls-certs\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:56:16.075608 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:16.075472 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/967142fa-69f6-4781-b3a3-f8b554af93ef-kserve-provision-location\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:56:16.282700 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:16.282660 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"] Apr 24 16:56:16.284628 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:16.284602 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7fd7bw8rfq"] Apr 24 16:56:17.859103 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:17.859069 2563 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="967142fa-69f6-4781-b3a3-f8b554af93ef" path="/var/lib/kubelet/pods/967142fa-69f6-4781-b3a3-f8b554af93ef/volumes" Apr 24 16:56:24.884062 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:24.884030 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g"] Apr 24 16:56:24.890289 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:24.890251 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="967142fa-69f6-4781-b3a3-f8b554af93ef" containerName="main" Apr 24 16:56:24.890289 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:24.890282 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="967142fa-69f6-4781-b3a3-f8b554af93ef" containerName="main" Apr 24 16:56:24.890494 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:24.890308 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="967142fa-69f6-4781-b3a3-f8b554af93ef" containerName="tokenizer" Apr 24 16:56:24.890494 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:24.890314 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="967142fa-69f6-4781-b3a3-f8b554af93ef" containerName="tokenizer" Apr 24 16:56:24.890494 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:24.890323 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="967142fa-69f6-4781-b3a3-f8b554af93ef" containerName="storage-initializer" Apr 24 16:56:24.890494 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:24.890330 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="967142fa-69f6-4781-b3a3-f8b554af93ef" containerName="storage-initializer" Apr 24 16:56:24.890494 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:24.890390 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="967142fa-69f6-4781-b3a3-f8b554af93ef" containerName="tokenizer" Apr 24 16:56:24.890494 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:24.890401 2563 
memory_manager.go:356] "RemoveStaleState removing state" podUID="967142fa-69f6-4781-b3a3-f8b554af93ef" containerName="main" Apr 24 16:56:24.895013 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:24.894995 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:24.898286 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:24.898268 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-lkpcj\"" Apr 24 16:56:24.898394 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:24.898364 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 24 16:56:24.904039 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:24.903998 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g"] Apr 24 16:56:24.945321 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:24.945286 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:24.945452 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:24.945332 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g\" (UID: 
\"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:24.945452 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:24.945368 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:24.945452 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:24.945406 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:24.945452 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:24.945437 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88q5v\" (UniqueName: \"kubernetes.io/projected/35074b93-5ad6-4d3d-90d8-f6091fc90638-kube-api-access-88q5v\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:24.945610 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:24.945473 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/35074b93-5ad6-4d3d-90d8-f6091fc90638-tls-certs\") pod 
\"scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:25.046234 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:25.046207 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:25.046378 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:25.046240 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:25.046443 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:25.046423 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88q5v\" (UniqueName: \"kubernetes.io/projected/35074b93-5ad6-4d3d-90d8-f6091fc90638-kube-api-access-88q5v\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:25.046480 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:25.046473 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/35074b93-5ad6-4d3d-90d8-f6091fc90638-tls-certs\") pod 
\"scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:25.046532 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:25.046501 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:25.046532 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:25.046527 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:25.046633 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:25.046594 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:25.046633 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:25.046609 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-tokenizer-uds\") pod 
\"scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:25.046792 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:25.046775 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:25.046856 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:25.046823 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:25.048848 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:25.048828 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/35074b93-5ad6-4d3d-90d8-f6091fc90638-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:25.057316 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:25.057295 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88q5v\" (UniqueName: \"kubernetes.io/projected/35074b93-5ad6-4d3d-90d8-f6091fc90638-kube-api-access-88q5v\") pod 
\"scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:25.204631 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:25.204601 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:25.345110 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:25.345079 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g"] Apr 24 16:56:25.348873 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:56:25.348839 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35074b93_5ad6_4d3d_90d8_f6091fc90638.slice/crio-6e5d0bd905c7f20d7573c25a1296bcdf03c9c014d70f0fcacc0c79ff30de6966 WatchSource:0}: Error finding container 6e5d0bd905c7f20d7573c25a1296bcdf03c9c014d70f0fcacc0c79ff30de6966: Status 404 returned error can't find the container with id 6e5d0bd905c7f20d7573c25a1296bcdf03c9c014d70f0fcacc0c79ff30de6966 Apr 24 16:56:25.350867 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:25.350851 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:56:25.995110 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:25.995079 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" event={"ID":"35074b93-5ad6-4d3d-90d8-f6091fc90638","Type":"ContainerStarted","Data":"99b60de514ec3cd8179467eec30fe291983cf54e4b76f40c0370f176dc4fef8a"} Apr 24 16:56:25.995110 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:25.995113 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" event={"ID":"35074b93-5ad6-4d3d-90d8-f6091fc90638","Type":"ContainerStarted","Data":"6e5d0bd905c7f20d7573c25a1296bcdf03c9c014d70f0fcacc0c79ff30de6966"} Apr 24 16:56:26.998797 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:26.998762 2563 generic.go:358] "Generic (PLEG): container finished" podID="35074b93-5ad6-4d3d-90d8-f6091fc90638" containerID="99b60de514ec3cd8179467eec30fe291983cf54e4b76f40c0370f176dc4fef8a" exitCode=0 Apr 24 16:56:26.999202 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:26.998845 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" event={"ID":"35074b93-5ad6-4d3d-90d8-f6091fc90638","Type":"ContainerDied","Data":"99b60de514ec3cd8179467eec30fe291983cf54e4b76f40c0370f176dc4fef8a"} Apr 24 16:56:28.004024 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:28.003986 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" event={"ID":"35074b93-5ad6-4d3d-90d8-f6091fc90638","Type":"ContainerStarted","Data":"c342378ea7c4c7fdb0b038cff30ae5bb350c6747e80110195836cbe0c60c5ff6"} Apr 24 16:56:28.004417 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:28.004030 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" event={"ID":"35074b93-5ad6-4d3d-90d8-f6091fc90638","Type":"ContainerStarted","Data":"24978cc97f159a5fc89d33b8a8239d6c55693ecb60df7eee6f5d65bbbf901418"} Apr 24 16:56:28.004417 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:28.004125 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:28.035281 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:28.035222 2563 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" podStartSLOduration=4.035203225 podStartE2EDuration="4.035203225s" podCreationTimestamp="2026-04-24 16:56:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:56:28.030282766 +0000 UTC m=+1046.700913678" watchObservedRunningTime="2026-04-24 16:56:28.035203225 +0000 UTC m=+1046.705834126" Apr 24 16:56:35.205321 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:35.205290 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:35.205321 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:35.205319 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:35.207895 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:35.207870 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:36.031307 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:36.031271 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:57.034877 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:57.034802 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:58.256788 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:58.256749 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g"] 
Apr 24 16:56:58.257215 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:58.257163 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" podUID="35074b93-5ad6-4d3d-90d8-f6091fc90638" containerName="main" containerID="cri-o://24978cc97f159a5fc89d33b8a8239d6c55693ecb60df7eee6f5d65bbbf901418" gracePeriod=30 Apr 24 16:56:58.257215 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:58.257196 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" podUID="35074b93-5ad6-4d3d-90d8-f6091fc90638" containerName="tokenizer" containerID="cri-o://c342378ea7c4c7fdb0b038cff30ae5bb350c6747e80110195836cbe0c60c5ff6" gracePeriod=30 Apr 24 16:56:59.102763 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.102729 2563 generic.go:358] "Generic (PLEG): container finished" podID="35074b93-5ad6-4d3d-90d8-f6091fc90638" containerID="24978cc97f159a5fc89d33b8a8239d6c55693ecb60df7eee6f5d65bbbf901418" exitCode=0 Apr 24 16:56:59.102931 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.102785 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" event={"ID":"35074b93-5ad6-4d3d-90d8-f6091fc90638","Type":"ContainerDied","Data":"24978cc97f159a5fc89d33b8a8239d6c55693ecb60df7eee6f5d65bbbf901418"} Apr 24 16:56:59.395231 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.395209 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:56:59.503785 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.503758 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-tokenizer-uds\") pod \"35074b93-5ad6-4d3d-90d8-f6091fc90638\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " Apr 24 16:56:59.503924 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.503791 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-kserve-provision-location\") pod \"35074b93-5ad6-4d3d-90d8-f6091fc90638\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " Apr 24 16:56:59.503924 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.503819 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-tokenizer-tmp\") pod \"35074b93-5ad6-4d3d-90d8-f6091fc90638\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " Apr 24 16:56:59.503924 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.503852 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/35074b93-5ad6-4d3d-90d8-f6091fc90638-tls-certs\") pod \"35074b93-5ad6-4d3d-90d8-f6091fc90638\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " Apr 24 16:56:59.504090 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.503928 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-tokenizer-cache\") pod \"35074b93-5ad6-4d3d-90d8-f6091fc90638\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " Apr 24 
16:56:59.504090 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.503971 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88q5v\" (UniqueName: \"kubernetes.io/projected/35074b93-5ad6-4d3d-90d8-f6091fc90638-kube-api-access-88q5v\") pod \"35074b93-5ad6-4d3d-90d8-f6091fc90638\" (UID: \"35074b93-5ad6-4d3d-90d8-f6091fc90638\") " Apr 24 16:56:59.504090 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.504034 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "35074b93-5ad6-4d3d-90d8-f6091fc90638" (UID: "35074b93-5ad6-4d3d-90d8-f6091fc90638"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:56:59.504278 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.504219 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-tokenizer-uds\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:56:59.504278 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.504251 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "35074b93-5ad6-4d3d-90d8-f6091fc90638" (UID: "35074b93-5ad6-4d3d-90d8-f6091fc90638"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:56:59.504278 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.504264 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "35074b93-5ad6-4d3d-90d8-f6091fc90638" (UID: "35074b93-5ad6-4d3d-90d8-f6091fc90638"). 
InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:56:59.504561 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.504537 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "35074b93-5ad6-4d3d-90d8-f6091fc90638" (UID: "35074b93-5ad6-4d3d-90d8-f6091fc90638"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:56:59.506016 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.505991 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35074b93-5ad6-4d3d-90d8-f6091fc90638-kube-api-access-88q5v" (OuterVolumeSpecName: "kube-api-access-88q5v") pod "35074b93-5ad6-4d3d-90d8-f6091fc90638" (UID: "35074b93-5ad6-4d3d-90d8-f6091fc90638"). InnerVolumeSpecName "kube-api-access-88q5v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:56:59.506111 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.506023 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35074b93-5ad6-4d3d-90d8-f6091fc90638-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "35074b93-5ad6-4d3d-90d8-f6091fc90638" (UID: "35074b93-5ad6-4d3d-90d8-f6091fc90638"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:56:59.604683 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.604645 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-tokenizer-cache\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:56:59.604683 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.604679 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-88q5v\" (UniqueName: \"kubernetes.io/projected/35074b93-5ad6-4d3d-90d8-f6091fc90638-kube-api-access-88q5v\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:56:59.604866 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.604694 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-kserve-provision-location\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:56:59.604866 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.604706 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/35074b93-5ad6-4d3d-90d8-f6091fc90638-tokenizer-tmp\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:56:59.604866 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:56:59.604719 2563 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/35074b93-5ad6-4d3d-90d8-f6091fc90638-tls-certs\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:57:00.108557 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:00.108526 2563 generic.go:358] "Generic (PLEG): container finished" podID="35074b93-5ad6-4d3d-90d8-f6091fc90638" containerID="c342378ea7c4c7fdb0b038cff30ae5bb350c6747e80110195836cbe0c60c5ff6" exitCode=0 Apr 24 16:57:00.108711 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:00.108587 2563 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" event={"ID":"35074b93-5ad6-4d3d-90d8-f6091fc90638","Type":"ContainerDied","Data":"c342378ea7c4c7fdb0b038cff30ae5bb350c6747e80110195836cbe0c60c5ff6"} Apr 24 16:57:00.108711 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:00.108611 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" event={"ID":"35074b93-5ad6-4d3d-90d8-f6091fc90638","Type":"ContainerDied","Data":"6e5d0bd905c7f20d7573c25a1296bcdf03c9c014d70f0fcacc0c79ff30de6966"} Apr 24 16:57:00.108711 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:00.108608 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g" Apr 24 16:57:00.108711 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:00.108687 2563 scope.go:117] "RemoveContainer" containerID="c342378ea7c4c7fdb0b038cff30ae5bb350c6747e80110195836cbe0c60c5ff6" Apr 24 16:57:00.116197 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:00.116181 2563 scope.go:117] "RemoveContainer" containerID="24978cc97f159a5fc89d33b8a8239d6c55693ecb60df7eee6f5d65bbbf901418" Apr 24 16:57:00.124613 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:00.124589 2563 scope.go:117] "RemoveContainer" containerID="99b60de514ec3cd8179467eec30fe291983cf54e4b76f40c0370f176dc4fef8a" Apr 24 16:57:00.127360 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:00.127339 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g"] Apr 24 16:57:00.131431 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:00.131412 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-df6c94xb5g"] Apr 24 16:57:00.132245 ip-10-0-137-69 
kubenswrapper[2563]: I0424 16:57:00.132230 2563 scope.go:117] "RemoveContainer" containerID="c342378ea7c4c7fdb0b038cff30ae5bb350c6747e80110195836cbe0c60c5ff6" Apr 24 16:57:00.132479 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:57:00.132460 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c342378ea7c4c7fdb0b038cff30ae5bb350c6747e80110195836cbe0c60c5ff6\": container with ID starting with c342378ea7c4c7fdb0b038cff30ae5bb350c6747e80110195836cbe0c60c5ff6 not found: ID does not exist" containerID="c342378ea7c4c7fdb0b038cff30ae5bb350c6747e80110195836cbe0c60c5ff6" Apr 24 16:57:00.132547 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:00.132491 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c342378ea7c4c7fdb0b038cff30ae5bb350c6747e80110195836cbe0c60c5ff6"} err="failed to get container status \"c342378ea7c4c7fdb0b038cff30ae5bb350c6747e80110195836cbe0c60c5ff6\": rpc error: code = NotFound desc = could not find container \"c342378ea7c4c7fdb0b038cff30ae5bb350c6747e80110195836cbe0c60c5ff6\": container with ID starting with c342378ea7c4c7fdb0b038cff30ae5bb350c6747e80110195836cbe0c60c5ff6 not found: ID does not exist" Apr 24 16:57:00.132547 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:00.132518 2563 scope.go:117] "RemoveContainer" containerID="24978cc97f159a5fc89d33b8a8239d6c55693ecb60df7eee6f5d65bbbf901418" Apr 24 16:57:00.132772 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:57:00.132755 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24978cc97f159a5fc89d33b8a8239d6c55693ecb60df7eee6f5d65bbbf901418\": container with ID starting with 24978cc97f159a5fc89d33b8a8239d6c55693ecb60df7eee6f5d65bbbf901418 not found: ID does not exist" containerID="24978cc97f159a5fc89d33b8a8239d6c55693ecb60df7eee6f5d65bbbf901418" Apr 24 16:57:00.132812 ip-10-0-137-69 kubenswrapper[2563]: I0424 
16:57:00.132779 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24978cc97f159a5fc89d33b8a8239d6c55693ecb60df7eee6f5d65bbbf901418"} err="failed to get container status \"24978cc97f159a5fc89d33b8a8239d6c55693ecb60df7eee6f5d65bbbf901418\": rpc error: code = NotFound desc = could not find container \"24978cc97f159a5fc89d33b8a8239d6c55693ecb60df7eee6f5d65bbbf901418\": container with ID starting with 24978cc97f159a5fc89d33b8a8239d6c55693ecb60df7eee6f5d65bbbf901418 not found: ID does not exist" Apr 24 16:57:00.132812 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:00.132795 2563 scope.go:117] "RemoveContainer" containerID="99b60de514ec3cd8179467eec30fe291983cf54e4b76f40c0370f176dc4fef8a" Apr 24 16:57:00.133007 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:57:00.132991 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99b60de514ec3cd8179467eec30fe291983cf54e4b76f40c0370f176dc4fef8a\": container with ID starting with 99b60de514ec3cd8179467eec30fe291983cf54e4b76f40c0370f176dc4fef8a not found: ID does not exist" containerID="99b60de514ec3cd8179467eec30fe291983cf54e4b76f40c0370f176dc4fef8a" Apr 24 16:57:00.133051 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:00.133012 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99b60de514ec3cd8179467eec30fe291983cf54e4b76f40c0370f176dc4fef8a"} err="failed to get container status \"99b60de514ec3cd8179467eec30fe291983cf54e4b76f40c0370f176dc4fef8a\": rpc error: code = NotFound desc = could not find container \"99b60de514ec3cd8179467eec30fe291983cf54e4b76f40c0370f176dc4fef8a\": container with ID starting with 99b60de514ec3cd8179467eec30fe291983cf54e4b76f40c0370f176dc4fef8a not found: ID does not exist" Apr 24 16:57:01.858801 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:01.858763 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="35074b93-5ad6-4d3d-90d8-f6091fc90638" path="/var/lib/kubelet/pods/35074b93-5ad6-4d3d-90d8-f6091fc90638/volumes" Apr 24 16:57:10.156222 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.156195 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5"] Apr 24 16:57:10.156600 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.156470 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="35074b93-5ad6-4d3d-90d8-f6091fc90638" containerName="tokenizer" Apr 24 16:57:10.156600 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.156480 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="35074b93-5ad6-4d3d-90d8-f6091fc90638" containerName="tokenizer" Apr 24 16:57:10.156600 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.156497 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="35074b93-5ad6-4d3d-90d8-f6091fc90638" containerName="storage-initializer" Apr 24 16:57:10.156600 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.156503 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="35074b93-5ad6-4d3d-90d8-f6091fc90638" containerName="storage-initializer" Apr 24 16:57:10.156600 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.156511 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="35074b93-5ad6-4d3d-90d8-f6091fc90638" containerName="main" Apr 24 16:57:10.156600 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.156516 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="35074b93-5ad6-4d3d-90d8-f6091fc90638" containerName="main" Apr 24 16:57:10.156600 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.156581 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="35074b93-5ad6-4d3d-90d8-f6091fc90638" containerName="tokenizer" Apr 24 16:57:10.156600 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.156590 2563 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="35074b93-5ad6-4d3d-90d8-f6091fc90638" containerName="main" Apr 24 16:57:10.161435 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.161411 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:10.163709 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.163686 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 24 16:57:10.163845 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.163735 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-hvncj\"" Apr 24 16:57:10.178545 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.178504 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5"] Apr 24 16:57:10.289616 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.289582 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:10.289768 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.289642 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:10.289768 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.289665 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:10.289768 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.289684 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:10.289768 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.289706 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:10.289768 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.289729 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cjm5\" (UniqueName: \"kubernetes.io/projected/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-kube-api-access-6cjm5\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5\" (UID: 
\"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:10.390436 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.390404 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:10.390436 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.390441 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:10.390671 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.390463 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:10.390671 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.390481 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:10.390671 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.390649 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cjm5\" (UniqueName: \"kubernetes.io/projected/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-kube-api-access-6cjm5\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:10.390836 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.390734 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:10.390836 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.390820 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:10.390933 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.390890 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:10.391076 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.391057 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:10.391180 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.391164 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:10.392900 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.392883 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:10.399281 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.399262 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cjm5\" (UniqueName: \"kubernetes.io/projected/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-kube-api-access-6cjm5\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:10.471011 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.470982 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:10.596270 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:10.596247 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5"] Apr 24 16:57:10.598265 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:57:10.598237 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79db7c52_88b4_4ba5_99c9_56bec7efe7d6.slice/crio-a7de50d453d7d3a06e25fd1aa27db8d25833062f3833fc1c8409f18db34e81b7 WatchSource:0}: Error finding container a7de50d453d7d3a06e25fd1aa27db8d25833062f3833fc1c8409f18db34e81b7: Status 404 returned error can't find the container with id a7de50d453d7d3a06e25fd1aa27db8d25833062f3833fc1c8409f18db34e81b7 Apr 24 16:57:11.147872 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:11.147841 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" event={"ID":"79db7c52-88b4-4ba5-99c9-56bec7efe7d6","Type":"ContainerStarted","Data":"c11e2a7655bbaf8e1c9a662d921590bb3946ec3fecd8b5ff1e73bdf714044f84"} Apr 24 16:57:11.147872 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:11.147876 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" event={"ID":"79db7c52-88b4-4ba5-99c9-56bec7efe7d6","Type":"ContainerStarted","Data":"a7de50d453d7d3a06e25fd1aa27db8d25833062f3833fc1c8409f18db34e81b7"} Apr 24 16:57:12.152396 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:12.152362 2563 generic.go:358] "Generic (PLEG): 
container finished" podID="79db7c52-88b4-4ba5-99c9-56bec7efe7d6" containerID="c11e2a7655bbaf8e1c9a662d921590bb3946ec3fecd8b5ff1e73bdf714044f84" exitCode=0 Apr 24 16:57:12.152396 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:12.152401 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" event={"ID":"79db7c52-88b4-4ba5-99c9-56bec7efe7d6","Type":"ContainerDied","Data":"c11e2a7655bbaf8e1c9a662d921590bb3946ec3fecd8b5ff1e73bdf714044f84"} Apr 24 16:57:13.157701 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:13.157663 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" event={"ID":"79db7c52-88b4-4ba5-99c9-56bec7efe7d6","Type":"ContainerStarted","Data":"bb2f5ad9df981f0d91bd91dd49287cb3ad8205e7872d744364a1d8b350d8cec5"} Apr 24 16:57:13.157701 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:13.157706 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" event={"ID":"79db7c52-88b4-4ba5-99c9-56bec7efe7d6","Type":"ContainerStarted","Data":"ed85d84613da436d8fe145826e903336619616d56a2f074c73d2da9fc0ab7856"} Apr 24 16:57:13.158255 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:13.157828 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:13.191155 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:13.191085 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" podStartSLOduration=3.191068089 podStartE2EDuration="3.191068089s" podCreationTimestamp="2026-04-24 16:57:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-24 16:57:13.188692066 +0000 UTC m=+1091.859322967" watchObservedRunningTime="2026-04-24 16:57:13.191068089 +0000 UTC m=+1091.861698994" Apr 24 16:57:20.471498 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:20.471461 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:20.471918 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:20.471511 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:20.474460 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:20.474414 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:21.186293 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:21.186264 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:41.834883 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:41.834724 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"] Apr 24 16:57:41.835422 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:41.835123 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w" podUID="e45408af-70a4-4b6e-9654-f9a854de4e08" containerName="main" containerID="cri-o://39e1af496afe3c8e7ed7ca95fbb8196004a29476c8c1c4b88991d5bf095572c5" gracePeriod=30 Apr 24 16:57:41.835422 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:41.835182 2563 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w" podUID="e45408af-70a4-4b6e-9654-f9a854de4e08" containerName="tokenizer" containerID="cri-o://69d5ceee1fa5bc220b411a1195869c49fa6bdcdf4b5dd99727be6b8622d00205" gracePeriod=30 Apr 24 16:57:42.253453 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:42.253421 2563 generic.go:358] "Generic (PLEG): container finished" podID="e45408af-70a4-4b6e-9654-f9a854de4e08" containerID="39e1af496afe3c8e7ed7ca95fbb8196004a29476c8c1c4b88991d5bf095572c5" exitCode=0 Apr 24 16:57:42.253453 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:42.253461 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w" event={"ID":"e45408af-70a4-4b6e-9654-f9a854de4e08","Type":"ContainerDied","Data":"39e1af496afe3c8e7ed7ca95fbb8196004a29476c8c1c4b88991d5bf095572c5"} Apr 24 16:57:42.987361 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:42.987337 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w" Apr 24 16:57:43.052066 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.052031 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-tokenizer-uds\") pod \"e45408af-70a4-4b6e-9654-f9a854de4e08\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " Apr 24 16:57:43.052354 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.052083 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92lm2\" (UniqueName: \"kubernetes.io/projected/e45408af-70a4-4b6e-9654-f9a854de4e08-kube-api-access-92lm2\") pod \"e45408af-70a4-4b6e-9654-f9a854de4e08\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " Apr 24 16:57:43.052354 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.052115 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e45408af-70a4-4b6e-9654-f9a854de4e08-tls-certs\") pod \"e45408af-70a4-4b6e-9654-f9a854de4e08\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " Apr 24 16:57:43.052354 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.052161 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-kserve-provision-location\") pod \"e45408af-70a4-4b6e-9654-f9a854de4e08\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " Apr 24 16:57:43.052354 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.052204 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-tokenizer-cache\") pod \"e45408af-70a4-4b6e-9654-f9a854de4e08\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " 
Apr 24 16:57:43.052354 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.052236 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-tokenizer-tmp\") pod \"e45408af-70a4-4b6e-9654-f9a854de4e08\" (UID: \"e45408af-70a4-4b6e-9654-f9a854de4e08\") " Apr 24 16:57:43.052354 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.052330 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "e45408af-70a4-4b6e-9654-f9a854de4e08" (UID: "e45408af-70a4-4b6e-9654-f9a854de4e08"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:57:43.052641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.052528 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-tokenizer-uds\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:57:43.052641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.052522 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "e45408af-70a4-4b6e-9654-f9a854de4e08" (UID: "e45408af-70a4-4b6e-9654-f9a854de4e08"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:57:43.052641 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.052619 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "e45408af-70a4-4b6e-9654-f9a854de4e08" (UID: "e45408af-70a4-4b6e-9654-f9a854de4e08"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:57:43.052915 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.052893 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e45408af-70a4-4b6e-9654-f9a854de4e08" (UID: "e45408af-70a4-4b6e-9654-f9a854de4e08"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:57:43.054251 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.054231 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e45408af-70a4-4b6e-9654-f9a854de4e08-kube-api-access-92lm2" (OuterVolumeSpecName: "kube-api-access-92lm2") pod "e45408af-70a4-4b6e-9654-f9a854de4e08" (UID: "e45408af-70a4-4b6e-9654-f9a854de4e08"). InnerVolumeSpecName "kube-api-access-92lm2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:57:43.054317 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.054261 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e45408af-70a4-4b6e-9654-f9a854de4e08-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e45408af-70a4-4b6e-9654-f9a854de4e08" (UID: "e45408af-70a4-4b6e-9654-f9a854de4e08"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:57:43.153491 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.153414 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-tokenizer-cache\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:57:43.153491 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.153442 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-tokenizer-tmp\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:57:43.153491 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.153452 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-92lm2\" (UniqueName: \"kubernetes.io/projected/e45408af-70a4-4b6e-9654-f9a854de4e08-kube-api-access-92lm2\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:57:43.153491 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.153462 2563 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e45408af-70a4-4b6e-9654-f9a854de4e08-tls-certs\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:57:43.153491 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.153472 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e45408af-70a4-4b6e-9654-f9a854de4e08-kserve-provision-location\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:57:43.192888 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.192860 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:57:43.258107 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.258075 2563 generic.go:358] "Generic (PLEG): container 
finished" podID="e45408af-70a4-4b6e-9654-f9a854de4e08" containerID="69d5ceee1fa5bc220b411a1195869c49fa6bdcdf4b5dd99727be6b8622d00205" exitCode=0 Apr 24 16:57:43.258276 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.258172 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w" event={"ID":"e45408af-70a4-4b6e-9654-f9a854de4e08","Type":"ContainerDied","Data":"69d5ceee1fa5bc220b411a1195869c49fa6bdcdf4b5dd99727be6b8622d00205"} Apr 24 16:57:43.258276 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.258211 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w" event={"ID":"e45408af-70a4-4b6e-9654-f9a854de4e08","Type":"ContainerDied","Data":"37827e4d530366253cb2416831bce5c16c454aa020fdbe4962b4e1e0d04c0c50"} Apr 24 16:57:43.258276 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.258233 2563 scope.go:117] "RemoveContainer" containerID="69d5ceee1fa5bc220b411a1195869c49fa6bdcdf4b5dd99727be6b8622d00205" Apr 24 16:57:43.258276 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.258179 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w" Apr 24 16:57:43.266848 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.266833 2563 scope.go:117] "RemoveContainer" containerID="39e1af496afe3c8e7ed7ca95fbb8196004a29476c8c1c4b88991d5bf095572c5" Apr 24 16:57:43.274111 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.274094 2563 scope.go:117] "RemoveContainer" containerID="f0fbb2cd306c75dd41c5edcf08389b95137e7f0c35214349607a03f883d52b4d" Apr 24 16:57:43.279794 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.279772 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"] Apr 24 16:57:43.281837 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.281818 2563 scope.go:117] "RemoveContainer" containerID="69d5ceee1fa5bc220b411a1195869c49fa6bdcdf4b5dd99727be6b8622d00205" Apr 24 16:57:43.282204 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.282114 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scherdt4w"] Apr 24 16:57:43.282276 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:57:43.282112 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69d5ceee1fa5bc220b411a1195869c49fa6bdcdf4b5dd99727be6b8622d00205\": container with ID starting with 69d5ceee1fa5bc220b411a1195869c49fa6bdcdf4b5dd99727be6b8622d00205 not found: ID does not exist" containerID="69d5ceee1fa5bc220b411a1195869c49fa6bdcdf4b5dd99727be6b8622d00205" Apr 24 16:57:43.282276 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.282237 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69d5ceee1fa5bc220b411a1195869c49fa6bdcdf4b5dd99727be6b8622d00205"} err="failed to get container status \"69d5ceee1fa5bc220b411a1195869c49fa6bdcdf4b5dd99727be6b8622d00205\": rpc 
error: code = NotFound desc = could not find container \"69d5ceee1fa5bc220b411a1195869c49fa6bdcdf4b5dd99727be6b8622d00205\": container with ID starting with 69d5ceee1fa5bc220b411a1195869c49fa6bdcdf4b5dd99727be6b8622d00205 not found: ID does not exist" Apr 24 16:57:43.282276 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.282258 2563 scope.go:117] "RemoveContainer" containerID="39e1af496afe3c8e7ed7ca95fbb8196004a29476c8c1c4b88991d5bf095572c5" Apr 24 16:57:43.282492 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:57:43.282473 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39e1af496afe3c8e7ed7ca95fbb8196004a29476c8c1c4b88991d5bf095572c5\": container with ID starting with 39e1af496afe3c8e7ed7ca95fbb8196004a29476c8c1c4b88991d5bf095572c5 not found: ID does not exist" containerID="39e1af496afe3c8e7ed7ca95fbb8196004a29476c8c1c4b88991d5bf095572c5" Apr 24 16:57:43.282532 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.282498 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39e1af496afe3c8e7ed7ca95fbb8196004a29476c8c1c4b88991d5bf095572c5"} err="failed to get container status \"39e1af496afe3c8e7ed7ca95fbb8196004a29476c8c1c4b88991d5bf095572c5\": rpc error: code = NotFound desc = could not find container \"39e1af496afe3c8e7ed7ca95fbb8196004a29476c8c1c4b88991d5bf095572c5\": container with ID starting with 39e1af496afe3c8e7ed7ca95fbb8196004a29476c8c1c4b88991d5bf095572c5 not found: ID does not exist" Apr 24 16:57:43.282532 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.282514 2563 scope.go:117] "RemoveContainer" containerID="f0fbb2cd306c75dd41c5edcf08389b95137e7f0c35214349607a03f883d52b4d" Apr 24 16:57:43.282742 ip-10-0-137-69 kubenswrapper[2563]: E0424 16:57:43.282723 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f0fbb2cd306c75dd41c5edcf08389b95137e7f0c35214349607a03f883d52b4d\": container with ID starting with f0fbb2cd306c75dd41c5edcf08389b95137e7f0c35214349607a03f883d52b4d not found: ID does not exist" containerID="f0fbb2cd306c75dd41c5edcf08389b95137e7f0c35214349607a03f883d52b4d" Apr 24 16:57:43.282783 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.282748 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0fbb2cd306c75dd41c5edcf08389b95137e7f0c35214349607a03f883d52b4d"} err="failed to get container status \"f0fbb2cd306c75dd41c5edcf08389b95137e7f0c35214349607a03f883d52b4d\": rpc error: code = NotFound desc = could not find container \"f0fbb2cd306c75dd41c5edcf08389b95137e7f0c35214349607a03f883d52b4d\": container with ID starting with f0fbb2cd306c75dd41c5edcf08389b95137e7f0c35214349607a03f883d52b4d not found: ID does not exist" Apr 24 16:57:43.864969 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:57:43.864933 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e45408af-70a4-4b6e-9654-f9a854de4e08" path="/var/lib/kubelet/pods/e45408af-70a4-4b6e-9654-f9a854de4e08/volumes" Apr 24 16:58:00.651798 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.651755 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9"] Apr 24 16:58:00.652346 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.652321 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e45408af-70a4-4b6e-9654-f9a854de4e08" containerName="main" Apr 24 16:58:00.652502 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.652487 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="e45408af-70a4-4b6e-9654-f9a854de4e08" containerName="main" Apr 24 16:58:00.652599 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.652558 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="e45408af-70a4-4b6e-9654-f9a854de4e08" containerName="storage-initializer" Apr 24 16:58:00.652599 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.652572 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="e45408af-70a4-4b6e-9654-f9a854de4e08" containerName="storage-initializer" Apr 24 16:58:00.652599 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.652590 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e45408af-70a4-4b6e-9654-f9a854de4e08" containerName="tokenizer" Apr 24 16:58:00.652599 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.652598 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="e45408af-70a4-4b6e-9654-f9a854de4e08" containerName="tokenizer" Apr 24 16:58:00.652806 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.652712 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="e45408af-70a4-4b6e-9654-f9a854de4e08" containerName="main" Apr 24 16:58:00.652806 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.652727 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="e45408af-70a4-4b6e-9654-f9a854de4e08" containerName="tokenizer" Apr 24 16:58:00.656050 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.656025 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:00.658437 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.658412 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-lcd6c\"" Apr 24 16:58:00.658559 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.658501 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 24 16:58:00.671407 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.671358 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9"] Apr 24 16:58:00.685687 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.685664 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:00.685799 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.685699 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:00.685799 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.685734 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:00.685799 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.685766 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:00.685799 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.685784 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:00.685933 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.685809 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb5kb\" (UniqueName: \"kubernetes.io/projected/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-kube-api-access-qb5kb\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:00.787125 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.787094 2563 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:00.787125 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.787131 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:00.787339 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.787183 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:00.787339 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.787221 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:00.787339 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.787248 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:00.787339 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.787273 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qb5kb\" (UniqueName: \"kubernetes.io/projected/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-kube-api-access-qb5kb\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:00.787583 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.787559 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:00.787621 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.787600 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:00.787655 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.787632 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:00.787688 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.787660 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:00.789688 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.789658 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:00.796278 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.796256 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb5kb\" (UniqueName: \"kubernetes.io/projected/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-kube-api-access-qb5kb\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:00.966145 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:00.966099 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:01.088832 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:01.088797 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9"] Apr 24 16:58:01.092478 ip-10-0-137-69 kubenswrapper[2563]: W0424 16:58:01.092452 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1cd1b1d_4405_48dd_b7b8_d0a32691e53c.slice/crio-90b4b16df34f28b355a3fd88157f7040eddf3b0fa92461381f1c26b39aae59e3 WatchSource:0}: Error finding container 90b4b16df34f28b355a3fd88157f7040eddf3b0fa92461381f1c26b39aae59e3: Status 404 returned error can't find the container with id 90b4b16df34f28b355a3fd88157f7040eddf3b0fa92461381f1c26b39aae59e3 Apr 24 16:58:01.322583 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:01.322499 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" event={"ID":"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c","Type":"ContainerStarted","Data":"0147385a053a6bb6ebc5796fd8ff057dd179e35a6f4bf3df37fb515b991b01d1"} Apr 24 16:58:01.322583 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:01.322538 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" event={"ID":"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c","Type":"ContainerStarted","Data":"90b4b16df34f28b355a3fd88157f7040eddf3b0fa92461381f1c26b39aae59e3"} Apr 24 16:58:02.326830 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:02.326798 2563 generic.go:358] "Generic (PLEG): container finished" podID="a1cd1b1d-4405-48dd-b7b8-d0a32691e53c" containerID="0147385a053a6bb6ebc5796fd8ff057dd179e35a6f4bf3df37fb515b991b01d1" exitCode=0 Apr 24 16:58:02.327225 ip-10-0-137-69 kubenswrapper[2563]: I0424 
16:58:02.326882 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" event={"ID":"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c","Type":"ContainerDied","Data":"0147385a053a6bb6ebc5796fd8ff057dd179e35a6f4bf3df37fb515b991b01d1"} Apr 24 16:58:03.332995 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:03.332963 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" event={"ID":"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c","Type":"ContainerStarted","Data":"65b509f1bdb85e68e240c10b0b56c7aee0d443a91f802d1733479cfa90947248"} Apr 24 16:58:03.332995 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:03.332996 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" event={"ID":"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c","Type":"ContainerStarted","Data":"30fc3df909182642de5f90cbf617a6029f08023d3493fa4a84fc997a3bf7ced5"} Apr 24 16:58:03.333417 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:03.333175 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:03.356428 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:03.356372 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" podStartSLOduration=3.356354052 podStartE2EDuration="3.356354052s" podCreationTimestamp="2026-04-24 16:58:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:58:03.355668004 +0000 UTC m=+1142.026298908" watchObservedRunningTime="2026-04-24 16:58:03.356354052 +0000 UTC m=+1142.026984954" Apr 24 16:58:10.967055 ip-10-0-137-69 
kubenswrapper[2563]: I0424 16:58:10.967020 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:10.967055 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:10.967057 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:10.969826 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:10.969800 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:11.360095 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:11.360010 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:58:26.394574 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:26.394486 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5"] Apr 24 16:58:26.395221 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:26.394864 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" podUID="79db7c52-88b4-4ba5-99c9-56bec7efe7d6" containerName="main" containerID="cri-o://ed85d84613da436d8fe145826e903336619616d56a2f074c73d2da9fc0ab7856" gracePeriod=30 Apr 24 16:58:26.395221 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:26.394947 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" podUID="79db7c52-88b4-4ba5-99c9-56bec7efe7d6" containerName="tokenizer" containerID="cri-o://bb2f5ad9df981f0d91bd91dd49287cb3ad8205e7872d744364a1d8b350d8cec5" 
gracePeriod=30 Apr 24 16:58:27.412317 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.412283 2563 generic.go:358] "Generic (PLEG): container finished" podID="79db7c52-88b4-4ba5-99c9-56bec7efe7d6" containerID="bb2f5ad9df981f0d91bd91dd49287cb3ad8205e7872d744364a1d8b350d8cec5" exitCode=0 Apr 24 16:58:27.412317 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.412315 2563 generic.go:358] "Generic (PLEG): container finished" podID="79db7c52-88b4-4ba5-99c9-56bec7efe7d6" containerID="ed85d84613da436d8fe145826e903336619616d56a2f074c73d2da9fc0ab7856" exitCode=0 Apr 24 16:58:27.412646 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.412348 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" event={"ID":"79db7c52-88b4-4ba5-99c9-56bec7efe7d6","Type":"ContainerDied","Data":"bb2f5ad9df981f0d91bd91dd49287cb3ad8205e7872d744364a1d8b350d8cec5"} Apr 24 16:58:27.412646 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.412386 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" event={"ID":"79db7c52-88b4-4ba5-99c9-56bec7efe7d6","Type":"ContainerDied","Data":"ed85d84613da436d8fe145826e903336619616d56a2f074c73d2da9fc0ab7856"} Apr 24 16:58:27.537171 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.537149 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:58:27.711623 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.711599 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tokenizer-cache\") pod \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " Apr 24 16:58:27.711782 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.711639 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tokenizer-uds\") pod \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " Apr 24 16:58:27.711782 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.711688 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cjm5\" (UniqueName: \"kubernetes.io/projected/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-kube-api-access-6cjm5\") pod \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " Apr 24 16:58:27.711782 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.711719 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tls-certs\") pod \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " Apr 24 16:58:27.711782 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.711739 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tokenizer-tmp\") pod \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " Apr 24 16:58:27.711782 
ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.711763 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-kserve-provision-location\") pod \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\" (UID: \"79db7c52-88b4-4ba5-99c9-56bec7efe7d6\") " Apr 24 16:58:27.712081 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.711862 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "79db7c52-88b4-4ba5-99c9-56bec7efe7d6" (UID: "79db7c52-88b4-4ba5-99c9-56bec7efe7d6"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:58:27.712081 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.712005 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tokenizer-cache\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:58:27.712081 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.712052 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "79db7c52-88b4-4ba5-99c9-56bec7efe7d6" (UID: "79db7c52-88b4-4ba5-99c9-56bec7efe7d6"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:58:27.712228 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.712104 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "79db7c52-88b4-4ba5-99c9-56bec7efe7d6" (UID: "79db7c52-88b4-4ba5-99c9-56bec7efe7d6"). 
InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:58:27.712531 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.712509 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "79db7c52-88b4-4ba5-99c9-56bec7efe7d6" (UID: "79db7c52-88b4-4ba5-99c9-56bec7efe7d6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:58:27.713781 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.713760 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "79db7c52-88b4-4ba5-99c9-56bec7efe7d6" (UID: "79db7c52-88b4-4ba5-99c9-56bec7efe7d6"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:58:27.713851 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.713801 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-kube-api-access-6cjm5" (OuterVolumeSpecName: "kube-api-access-6cjm5") pod "79db7c52-88b4-4ba5-99c9-56bec7efe7d6" (UID: "79db7c52-88b4-4ba5-99c9-56bec7efe7d6"). InnerVolumeSpecName "kube-api-access-6cjm5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:58:27.812938 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.812904 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tokenizer-uds\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:58:27.812938 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.812939 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6cjm5\" (UniqueName: \"kubernetes.io/projected/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-kube-api-access-6cjm5\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:58:27.813106 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.812951 2563 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tls-certs\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:58:27.813106 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.812962 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-tokenizer-tmp\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:58:27.813106 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:27.812970 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79db7c52-88b4-4ba5-99c9-56bec7efe7d6-kserve-provision-location\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:58:28.416751 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:28.416714 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" event={"ID":"79db7c52-88b4-4ba5-99c9-56bec7efe7d6","Type":"ContainerDied","Data":"a7de50d453d7d3a06e25fd1aa27db8d25833062f3833fc1c8409f18db34e81b7"} 
Apr 24 16:58:28.416751 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:28.416737 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5" Apr 24 16:58:28.417259 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:28.416768 2563 scope.go:117] "RemoveContainer" containerID="bb2f5ad9df981f0d91bd91dd49287cb3ad8205e7872d744364a1d8b350d8cec5" Apr 24 16:58:28.424630 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:28.424608 2563 scope.go:117] "RemoveContainer" containerID="ed85d84613da436d8fe145826e903336619616d56a2f074c73d2da9fc0ab7856" Apr 24 16:58:28.431547 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:28.431528 2563 scope.go:117] "RemoveContainer" containerID="c11e2a7655bbaf8e1c9a662d921590bb3946ec3fecd8b5ff1e73bdf714044f84" Apr 24 16:58:28.435538 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:28.435512 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5"] Apr 24 16:58:28.439694 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:28.439674 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-8c86675ntdc5"] Apr 24 16:58:29.859296 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:29.859255 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79db7c52-88b4-4ba5-99c9-56bec7efe7d6" path="/var/lib/kubelet/pods/79db7c52-88b4-4ba5-99c9-56bec7efe7d6/volumes" Apr 24 16:58:32.363973 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:58:32.363944 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:59:46.545524 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:46.545493 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9"] Apr 24 16:59:46.548050 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:46.545793 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" podUID="a1cd1b1d-4405-48dd-b7b8-d0a32691e53c" containerName="main" containerID="cri-o://30fc3df909182642de5f90cbf617a6029f08023d3493fa4a84fc997a3bf7ced5" gracePeriod=30 Apr 24 16:59:46.548050 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:46.545871 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" podUID="a1cd1b1d-4405-48dd-b7b8-d0a32691e53c" containerName="tokenizer" containerID="cri-o://65b509f1bdb85e68e240c10b0b56c7aee0d443a91f802d1733479cfa90947248" gracePeriod=30 Apr 24 16:59:46.660810 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:46.660783 2563 generic.go:358] "Generic (PLEG): container finished" podID="a1cd1b1d-4405-48dd-b7b8-d0a32691e53c" containerID="30fc3df909182642de5f90cbf617a6029f08023d3493fa4a84fc997a3bf7ced5" exitCode=0 Apr 24 16:59:46.660949 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:46.660851 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" event={"ID":"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c","Type":"ContainerDied","Data":"30fc3df909182642de5f90cbf617a6029f08023d3493fa4a84fc997a3bf7ced5"} Apr 24 16:59:47.667004 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.666973 2563 generic.go:358] "Generic (PLEG): container finished" podID="a1cd1b1d-4405-48dd-b7b8-d0a32691e53c" containerID="65b509f1bdb85e68e240c10b0b56c7aee0d443a91f802d1733479cfa90947248" exitCode=0 Apr 24 16:59:47.667352 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.667042 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" event={"ID":"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c","Type":"ContainerDied","Data":"65b509f1bdb85e68e240c10b0b56c7aee0d443a91f802d1733479cfa90947248"} Apr 24 16:59:47.709027 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.709007 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:59:47.823659 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.823567 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tokenizer-cache\") pod \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " Apr 24 16:59:47.823659 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.823609 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tls-certs\") pod \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " Apr 24 16:59:47.823876 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.823683 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb5kb\" (UniqueName: \"kubernetes.io/projected/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-kube-api-access-qb5kb\") pod \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " Apr 24 16:59:47.823876 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.823734 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-kserve-provision-location\") pod \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\" (UID: 
\"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " Apr 24 16:59:47.823876 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.823769 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tokenizer-tmp\") pod \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " Apr 24 16:59:47.823876 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.823794 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tokenizer-uds\") pod \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\" (UID: \"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c\") " Apr 24 16:59:47.823876 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.823838 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "a1cd1b1d-4405-48dd-b7b8-d0a32691e53c" (UID: "a1cd1b1d-4405-48dd-b7b8-d0a32691e53c"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:59:47.824131 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.824057 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tokenizer-cache\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:59:47.824131 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.824078 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "a1cd1b1d-4405-48dd-b7b8-d0a32691e53c" (UID: "a1cd1b1d-4405-48dd-b7b8-d0a32691e53c"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:59:47.824131 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.824105 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "a1cd1b1d-4405-48dd-b7b8-d0a32691e53c" (UID: "a1cd1b1d-4405-48dd-b7b8-d0a32691e53c"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:59:47.824493 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.824471 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a1cd1b1d-4405-48dd-b7b8-d0a32691e53c" (UID: "a1cd1b1d-4405-48dd-b7b8-d0a32691e53c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:59:47.825781 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.825756 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a1cd1b1d-4405-48dd-b7b8-d0a32691e53c" (UID: "a1cd1b1d-4405-48dd-b7b8-d0a32691e53c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:59:47.825882 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.825787 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-kube-api-access-qb5kb" (OuterVolumeSpecName: "kube-api-access-qb5kb") pod "a1cd1b1d-4405-48dd-b7b8-d0a32691e53c" (UID: "a1cd1b1d-4405-48dd-b7b8-d0a32691e53c"). InnerVolumeSpecName "kube-api-access-qb5kb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:59:47.924962 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.924936 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tokenizer-tmp\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:59:47.924962 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.924959 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tokenizer-uds\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:59:47.925121 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.924969 2563 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-tls-certs\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:59:47.925121 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.924977 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qb5kb\" (UniqueName: \"kubernetes.io/projected/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-kube-api-access-qb5kb\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:59:47.925121 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:47.924986 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c-kserve-provision-location\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 16:59:48.672264 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:48.672219 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" event={"ID":"a1cd1b1d-4405-48dd-b7b8-d0a32691e53c","Type":"ContainerDied","Data":"90b4b16df34f28b355a3fd88157f7040eddf3b0fa92461381f1c26b39aae59e3"} 
Apr 24 16:59:48.672673 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:48.672284 2563 scope.go:117] "RemoveContainer" containerID="65b509f1bdb85e68e240c10b0b56c7aee0d443a91f802d1733479cfa90947248" Apr 24 16:59:48.672673 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:48.672234 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9" Apr 24 16:59:48.680192 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:48.680174 2563 scope.go:117] "RemoveContainer" containerID="30fc3df909182642de5f90cbf617a6029f08023d3493fa4a84fc997a3bf7ced5" Apr 24 16:59:48.687182 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:48.687164 2563 scope.go:117] "RemoveContainer" containerID="0147385a053a6bb6ebc5796fd8ff057dd179e35a6f4bf3df37fb515b991b01d1" Apr 24 16:59:48.689556 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:48.689532 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9"] Apr 24 16:59:48.695613 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:48.695585 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c5745f965lt9"] Apr 24 16:59:49.859805 ip-10-0-137-69 kubenswrapper[2563]: I0424 16:59:49.859766 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1cd1b1d-4405-48dd-b7b8-d0a32691e53c" path="/var/lib/kubelet/pods/a1cd1b1d-4405-48dd-b7b8-d0a32691e53c/volumes" Apr 24 17:00:01.110659 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.110624 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm"] Apr 24 17:00:01.111051 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.110912 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1cd1b1d-4405-48dd-b7b8-d0a32691e53c" 
containerName="storage-initializer" Apr 24 17:00:01.111051 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.110923 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1cd1b1d-4405-48dd-b7b8-d0a32691e53c" containerName="storage-initializer" Apr 24 17:00:01.111051 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.110936 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1cd1b1d-4405-48dd-b7b8-d0a32691e53c" containerName="tokenizer" Apr 24 17:00:01.111051 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.110942 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1cd1b1d-4405-48dd-b7b8-d0a32691e53c" containerName="tokenizer" Apr 24 17:00:01.111051 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.110956 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1cd1b1d-4405-48dd-b7b8-d0a32691e53c" containerName="main" Apr 24 17:00:01.111051 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.110961 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1cd1b1d-4405-48dd-b7b8-d0a32691e53c" containerName="main" Apr 24 17:00:01.111051 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.110974 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79db7c52-88b4-4ba5-99c9-56bec7efe7d6" containerName="storage-initializer" Apr 24 17:00:01.111051 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.110983 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="79db7c52-88b4-4ba5-99c9-56bec7efe7d6" containerName="storage-initializer" Apr 24 17:00:01.111051 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.110992 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79db7c52-88b4-4ba5-99c9-56bec7efe7d6" containerName="tokenizer" Apr 24 17:00:01.111051 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.110998 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="79db7c52-88b4-4ba5-99c9-56bec7efe7d6" containerName="tokenizer" 
Apr 24 17:00:01.111051 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.111005 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79db7c52-88b4-4ba5-99c9-56bec7efe7d6" containerName="main" Apr 24 17:00:01.111051 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.111011 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="79db7c52-88b4-4ba5-99c9-56bec7efe7d6" containerName="main" Apr 24 17:00:01.111051 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.111057 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="79db7c52-88b4-4ba5-99c9-56bec7efe7d6" containerName="tokenizer" Apr 24 17:00:01.111449 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.111065 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1cd1b1d-4405-48dd-b7b8-d0a32691e53c" containerName="tokenizer" Apr 24 17:00:01.111449 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.111073 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="79db7c52-88b4-4ba5-99c9-56bec7efe7d6" containerName="main" Apr 24 17:00:01.111449 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.111080 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1cd1b1d-4405-48dd-b7b8-d0a32691e53c" containerName="main" Apr 24 17:00:01.114155 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.114118 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:01.117533 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.117507 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 17:00:01.117668 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.117624 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 17:00:01.118467 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.118450 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-rm5tr\"" Apr 24 17:00:01.118554 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.118452 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q89jh\"" Apr 24 17:00:01.118554 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.118535 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 24 17:00:01.123642 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.123619 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm"] Apr 24 17:00:01.224802 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.224770 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:01.224960 ip-10-0-137-69 kubenswrapper[2563]: 
I0424 17:00:01.224891 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:01.224960 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.224945 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:01.225057 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.224978 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:01.225057 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.225004 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:01.225057 ip-10-0-137-69 
kubenswrapper[2563]: I0424 17:00:01.225034 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngrsr\" (UniqueName: \"kubernetes.io/projected/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-kube-api-access-ngrsr\") pod \"router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:01.326299 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.326261 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:01.326299 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.326300 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:01.326519 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.326320 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:01.326519 ip-10-0-137-69 
kubenswrapper[2563]: I0424 17:00:01.326342 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:01.326519 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.326362 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngrsr\" (UniqueName: \"kubernetes.io/projected/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-kube-api-access-ngrsr\") pod \"router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:01.326519 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.326405 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:01.326721 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.326677 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:01.326782 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.326724 2563 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:01.326782 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.326766 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:01.326858 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.326808 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:01.328742 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.328722 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:01.334717 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.334686 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ngrsr\" (UniqueName: \"kubernetes.io/projected/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-kube-api-access-ngrsr\") pod \"router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:01.423807 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.423736 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:01.570090 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.570051 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm"] Apr 24 17:00:01.571744 ip-10-0-137-69 kubenswrapper[2563]: W0424 17:00:01.571713 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbafd5b3_e3b0_4f8e_8e57_5d6abd7542a8.slice/crio-da91678d5b8646765a253afdb26f55722eaec8df99dcff225ccdb0a21ae6ec45 WatchSource:0}: Error finding container da91678d5b8646765a253afdb26f55722eaec8df99dcff225ccdb0a21ae6ec45: Status 404 returned error can't find the container with id da91678d5b8646765a253afdb26f55722eaec8df99dcff225ccdb0a21ae6ec45 Apr 24 17:00:01.718551 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.718509 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" event={"ID":"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8","Type":"ContainerStarted","Data":"34195c13c7bfead09951ef8d912bc53ddd60a84df45759bebc8d0f84c7211e15"} Apr 24 17:00:01.718742 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:01.718559 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" 
event={"ID":"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8","Type":"ContainerStarted","Data":"da91678d5b8646765a253afdb26f55722eaec8df99dcff225ccdb0a21ae6ec45"} Apr 24 17:00:03.725735 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:03.725701 2563 generic.go:358] "Generic (PLEG): container finished" podID="bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8" containerID="34195c13c7bfead09951ef8d912bc53ddd60a84df45759bebc8d0f84c7211e15" exitCode=0 Apr 24 17:00:03.726105 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:03.725775 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" event={"ID":"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8","Type":"ContainerDied","Data":"34195c13c7bfead09951ef8d912bc53ddd60a84df45759bebc8d0f84c7211e15"} Apr 24 17:00:04.730956 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:04.730925 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" event={"ID":"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8","Type":"ContainerStarted","Data":"d4c795d71f93ef8eaacdd9c14d92be43f1713e490aab40e8d8bfae402c8372cf"} Apr 24 17:00:04.731359 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:04.730966 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" event={"ID":"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8","Type":"ContainerStarted","Data":"e8475f08d31e7e4e382b5771296d732892b2f4e7d8e12373aaa7d126fcd4361f"} Apr 24 17:00:04.731359 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:04.731176 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:04.754324 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:04.754274 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" podStartSLOduration=3.754255518 podStartE2EDuration="3.754255518s" podCreationTimestamp="2026-04-24 17:00:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:00:04.753506387 +0000 UTC m=+1263.424137300" watchObservedRunningTime="2026-04-24 17:00:04.754255518 +0000 UTC m=+1263.424886423" Apr 24 17:00:11.424659 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:11.424623 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:11.424659 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:11.424656 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:11.427323 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:11.427296 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:11.756237 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:11.756210 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:00:42.759687 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:00:42.759658 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:01:46.648483 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:46.648445 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm"] Apr 24 17:01:46.648906 ip-10-0-137-69 kubenswrapper[2563]: I0424 
17:01:46.648820 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" podUID="bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8" containerName="main" containerID="cri-o://e8475f08d31e7e4e382b5771296d732892b2f4e7d8e12373aaa7d126fcd4361f" gracePeriod=30 Apr 24 17:01:46.648906 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:46.648842 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" podUID="bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8" containerName="tokenizer" containerID="cri-o://d4c795d71f93ef8eaacdd9c14d92be43f1713e490aab40e8d8bfae402c8372cf" gracePeriod=30 Apr 24 17:01:47.048163 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:47.048108 2563 generic.go:358] "Generic (PLEG): container finished" podID="bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8" containerID="e8475f08d31e7e4e382b5771296d732892b2f4e7d8e12373aaa7d126fcd4361f" exitCode=0 Apr 24 17:01:47.048327 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:47.048179 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" event={"ID":"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8","Type":"ContainerDied","Data":"e8475f08d31e7e4e382b5771296d732892b2f4e7d8e12373aaa7d126fcd4361f"} Apr 24 17:01:47.895273 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:47.895251 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:01:47.933429 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:47.933370 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tls-certs\") pod \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " Apr 24 17:01:47.933429 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:47.933407 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tokenizer-cache\") pod \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " Apr 24 17:01:47.933580 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:47.933432 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngrsr\" (UniqueName: \"kubernetes.io/projected/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-kube-api-access-ngrsr\") pod \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " Apr 24 17:01:47.933580 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:47.933458 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tokenizer-uds\") pod \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " Apr 24 17:01:47.933580 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:47.933530 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-kserve-provision-location\") pod \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " Apr 
24 17:01:47.933580 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:47.933575 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tokenizer-tmp\") pod \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\" (UID: \"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8\") " Apr 24 17:01:47.933766 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:47.933618 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8" (UID: "bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:01:47.933860 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:47.933837 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tokenizer-cache\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 17:01:47.933956 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:47.933843 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8" (UID: "bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:01:47.934037 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:47.933948 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8" (UID: "bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:01:47.934366 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:47.934343 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8" (UID: "bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:01:47.935482 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:47.935462 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-kube-api-access-ngrsr" (OuterVolumeSpecName: "kube-api-access-ngrsr") pod "bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8" (UID: "bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8"). InnerVolumeSpecName "kube-api-access-ngrsr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:01:47.935580 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:47.935562 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8" (UID: "bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:01:48.034496 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:48.034473 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ngrsr\" (UniqueName: \"kubernetes.io/projected/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-kube-api-access-ngrsr\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 17:01:48.034496 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:48.034497 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tokenizer-uds\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 17:01:48.034632 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:48.034507 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-kserve-provision-location\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 17:01:48.034632 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:48.034516 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tokenizer-tmp\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 17:01:48.034632 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:48.034526 2563 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8-tls-certs\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 17:01:48.053579 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:48.053555 2563 generic.go:358] "Generic (PLEG): container finished" podID="bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8" containerID="d4c795d71f93ef8eaacdd9c14d92be43f1713e490aab40e8d8bfae402c8372cf" exitCode=0 Apr 24 17:01:48.053688 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:48.053631 2563 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" Apr 24 17:01:48.053688 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:48.053632 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" event={"ID":"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8","Type":"ContainerDied","Data":"d4c795d71f93ef8eaacdd9c14d92be43f1713e490aab40e8d8bfae402c8372cf"} Apr 24 17:01:48.053688 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:48.053670 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm" event={"ID":"bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8","Type":"ContainerDied","Data":"da91678d5b8646765a253afdb26f55722eaec8df99dcff225ccdb0a21ae6ec45"} Apr 24 17:01:48.053688 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:48.053684 2563 scope.go:117] "RemoveContainer" containerID="d4c795d71f93ef8eaacdd9c14d92be43f1713e490aab40e8d8bfae402c8372cf" Apr 24 17:01:48.061554 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:48.061537 2563 scope.go:117] "RemoveContainer" containerID="e8475f08d31e7e4e382b5771296d732892b2f4e7d8e12373aaa7d126fcd4361f" Apr 24 17:01:48.068305 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:48.068289 2563 scope.go:117] "RemoveContainer" containerID="34195c13c7bfead09951ef8d912bc53ddd60a84df45759bebc8d0f84c7211e15" Apr 24 17:01:48.075172 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:48.075153 2563 scope.go:117] "RemoveContainer" containerID="d4c795d71f93ef8eaacdd9c14d92be43f1713e490aab40e8d8bfae402c8372cf" Apr 24 17:01:48.075465 ip-10-0-137-69 kubenswrapper[2563]: E0424 17:01:48.075445 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4c795d71f93ef8eaacdd9c14d92be43f1713e490aab40e8d8bfae402c8372cf\": container with ID starting with 
d4c795d71f93ef8eaacdd9c14d92be43f1713e490aab40e8d8bfae402c8372cf not found: ID does not exist" containerID="d4c795d71f93ef8eaacdd9c14d92be43f1713e490aab40e8d8bfae402c8372cf" Apr 24 17:01:48.075541 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:48.075479 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4c795d71f93ef8eaacdd9c14d92be43f1713e490aab40e8d8bfae402c8372cf"} err="failed to get container status \"d4c795d71f93ef8eaacdd9c14d92be43f1713e490aab40e8d8bfae402c8372cf\": rpc error: code = NotFound desc = could not find container \"d4c795d71f93ef8eaacdd9c14d92be43f1713e490aab40e8d8bfae402c8372cf\": container with ID starting with d4c795d71f93ef8eaacdd9c14d92be43f1713e490aab40e8d8bfae402c8372cf not found: ID does not exist" Apr 24 17:01:48.075541 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:48.075502 2563 scope.go:117] "RemoveContainer" containerID="e8475f08d31e7e4e382b5771296d732892b2f4e7d8e12373aaa7d126fcd4361f" Apr 24 17:01:48.075779 ip-10-0-137-69 kubenswrapper[2563]: E0424 17:01:48.075759 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8475f08d31e7e4e382b5771296d732892b2f4e7d8e12373aaa7d126fcd4361f\": container with ID starting with e8475f08d31e7e4e382b5771296d732892b2f4e7d8e12373aaa7d126fcd4361f not found: ID does not exist" containerID="e8475f08d31e7e4e382b5771296d732892b2f4e7d8e12373aaa7d126fcd4361f" Apr 24 17:01:48.075850 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:48.075785 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8475f08d31e7e4e382b5771296d732892b2f4e7d8e12373aaa7d126fcd4361f"} err="failed to get container status \"e8475f08d31e7e4e382b5771296d732892b2f4e7d8e12373aaa7d126fcd4361f\": rpc error: code = NotFound desc = could not find container \"e8475f08d31e7e4e382b5771296d732892b2f4e7d8e12373aaa7d126fcd4361f\": container with ID starting with 
e8475f08d31e7e4e382b5771296d732892b2f4e7d8e12373aaa7d126fcd4361f not found: ID does not exist" Apr 24 17:01:48.075850 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:48.075801 2563 scope.go:117] "RemoveContainer" containerID="34195c13c7bfead09951ef8d912bc53ddd60a84df45759bebc8d0f84c7211e15" Apr 24 17:01:48.075930 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:48.075898 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm"] Apr 24 17:01:48.076045 ip-10-0-137-69 kubenswrapper[2563]: E0424 17:01:48.076029 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34195c13c7bfead09951ef8d912bc53ddd60a84df45759bebc8d0f84c7211e15\": container with ID starting with 34195c13c7bfead09951ef8d912bc53ddd60a84df45759bebc8d0f84c7211e15 not found: ID does not exist" containerID="34195c13c7bfead09951ef8d912bc53ddd60a84df45759bebc8d0f84c7211e15" Apr 24 17:01:48.076086 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:48.076050 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34195c13c7bfead09951ef8d912bc53ddd60a84df45759bebc8d0f84c7211e15"} err="failed to get container status \"34195c13c7bfead09951ef8d912bc53ddd60a84df45759bebc8d0f84c7211e15\": rpc error: code = NotFound desc = could not find container \"34195c13c7bfead09951ef8d912bc53ddd60a84df45759bebc8d0f84c7211e15\": container with ID starting with 34195c13c7bfead09951ef8d912bc53ddd60a84df45759bebc8d0f84c7211e15 not found: ID does not exist" Apr 24 17:01:48.081057 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:48.081037 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-84c765fcbc-mvpnm"] Apr 24 17:01:49.859984 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:49.859954 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8" path="/var/lib/kubelet/pods/bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8/volumes" Apr 24 17:01:56.801994 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.801949 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc"] Apr 24 17:01:56.802435 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.802276 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8" containerName="tokenizer" Apr 24 17:01:56.802435 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.802288 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8" containerName="tokenizer" Apr 24 17:01:56.802435 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.802302 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8" containerName="storage-initializer" Apr 24 17:01:56.802435 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.802308 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8" containerName="storage-initializer" Apr 24 17:01:56.802435 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.802321 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8" containerName="main" Apr 24 17:01:56.802435 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.802327 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8" containerName="main" Apr 24 17:01:56.802435 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.802379 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8" containerName="main" Apr 24 17:01:56.802435 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.802386 2563 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="bbafd5b3-e3b0-4f8e-8e57-5d6abd7542a8" containerName="tokenizer" Apr 24 17:01:56.807121 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.807102 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:56.809341 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.809315 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 17:01:56.810014 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.809996 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-rjq89\"" Apr 24 17:01:56.810112 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.809996 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 17:01:56.810112 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.809997 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 24 17:01:56.810112 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.810070 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q89jh\"" Apr 24 17:01:56.816823 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.816803 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc"] Apr 24 17:01:56.899814 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.899783 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f7394f44-4777-43d7-a60c-da767d1e5488-tls-certs\") pod 
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:56.899814 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.899816 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7k8l\" (UniqueName: \"kubernetes.io/projected/f7394f44-4777-43d7-a60c-da767d1e5488-kube-api-access-h7k8l\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:56.899999 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.899836 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:56.899999 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.899934 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:56.899999 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.899980 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:56.900096 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:56.899998 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:57.001095 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:57.001063 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f7394f44-4777-43d7-a60c-da767d1e5488-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:57.001095 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:57.001096 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7k8l\" (UniqueName: \"kubernetes.io/projected/f7394f44-4777-43d7-a60c-da767d1e5488-kube-api-access-h7k8l\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:57.001298 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:57.001114 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:57.001298 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:57.001168 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:57.001298 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:57.001210 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:57.001298 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:57.001236 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:57.001589 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:57.001567 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:57.001654 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:57.001605 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:57.001654 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:57.001627 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:57.001745 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:57.001661 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:57.003619 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:57.003598 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f7394f44-4777-43d7-a60c-da767d1e5488-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:57.009184 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:57.009160 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7k8l\" (UniqueName: \"kubernetes.io/projected/f7394f44-4777-43d7-a60c-da767d1e5488-kube-api-access-h7k8l\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:57.117697 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:57.117617 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:57.246199 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:57.246130 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc"] Apr 24 17:01:57.248363 ip-10-0-137-69 kubenswrapper[2563]: W0424 17:01:57.248328 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7394f44_4777_43d7_a60c_da767d1e5488.slice/crio-fafcbfd0023a55f1fb8d4c5b0e704aa4b365f0667e748fa396d11e3526dbbba5 WatchSource:0}: Error finding container fafcbfd0023a55f1fb8d4c5b0e704aa4b365f0667e748fa396d11e3526dbbba5: Status 404 returned error can't find the container with id fafcbfd0023a55f1fb8d4c5b0e704aa4b365f0667e748fa396d11e3526dbbba5 Apr 24 17:01:57.250195 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:57.250177 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 
17:01:58.086171 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:58.086118 2563 generic.go:358] "Generic (PLEG): container finished" podID="f7394f44-4777-43d7-a60c-da767d1e5488" containerID="4dd4c9d5629cd6216f9529e95b8afe02ffc750d138fcfde33867a4514aa2fe70" exitCode=0 Apr 24 17:01:58.086560 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:58.086208 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" event={"ID":"f7394f44-4777-43d7-a60c-da767d1e5488","Type":"ContainerDied","Data":"4dd4c9d5629cd6216f9529e95b8afe02ffc750d138fcfde33867a4514aa2fe70"} Apr 24 17:01:58.086560 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:58.086254 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" event={"ID":"f7394f44-4777-43d7-a60c-da767d1e5488","Type":"ContainerStarted","Data":"fafcbfd0023a55f1fb8d4c5b0e704aa4b365f0667e748fa396d11e3526dbbba5"} Apr 24 17:01:59.091091 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:59.091056 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" event={"ID":"f7394f44-4777-43d7-a60c-da767d1e5488","Type":"ContainerStarted","Data":"e1f548c64491033999402454efd403f4415276489a855bb7cb5df64c1ad3253a"} Apr 24 17:01:59.091091 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:59.091092 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" event={"ID":"f7394f44-4777-43d7-a60c-da767d1e5488","Type":"ContainerStarted","Data":"771d90e2805c2d1e2701ad9436d9eb19b50ad25b3051093c4ccf797a3b87a34c"} Apr 24 17:01:59.091529 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:59.091298 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:01:59.111400 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:01:59.111356 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" podStartSLOduration=3.11134356 podStartE2EDuration="3.11134356s" podCreationTimestamp="2026-04-24 17:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:01:59.110057605 +0000 UTC m=+1377.780688506" watchObservedRunningTime="2026-04-24 17:01:59.11134356 +0000 UTC m=+1377.781974462" Apr 24 17:02:07.117905 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:02:07.117874 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:02:07.118307 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:02:07.117919 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:02:07.120608 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:02:07.120582 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:02:08.120698 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:02:08.120669 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:02:29.123474 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:02:29.123445 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:05:34.981311 ip-10-0-137-69 
kubenswrapper[2563]: I0424 17:05:34.981281 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc"] Apr 24 17:05:34.983783 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:34.981586 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" podUID="f7394f44-4777-43d7-a60c-da767d1e5488" containerName="main" containerID="cri-o://771d90e2805c2d1e2701ad9436d9eb19b50ad25b3051093c4ccf797a3b87a34c" gracePeriod=30 Apr 24 17:05:34.983783 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:34.981618 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" podUID="f7394f44-4777-43d7-a60c-da767d1e5488" containerName="tokenizer" containerID="cri-o://e1f548c64491033999402454efd403f4415276489a855bb7cb5df64c1ad3253a" gracePeriod=30 Apr 24 17:05:35.784029 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:35.783995 2563 generic.go:358] "Generic (PLEG): container finished" podID="f7394f44-4777-43d7-a60c-da767d1e5488" containerID="771d90e2805c2d1e2701ad9436d9eb19b50ad25b3051093c4ccf797a3b87a34c" exitCode=0 Apr 24 17:05:35.784228 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:35.784041 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" event={"ID":"f7394f44-4777-43d7-a60c-da767d1e5488","Type":"ContainerDied","Data":"771d90e2805c2d1e2701ad9436d9eb19b50ad25b3051093c4ccf797a3b87a34c"} Apr 24 17:05:36.138401 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.138371 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:05:36.155687 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.155658 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-tokenizer-tmp\") pod \"f7394f44-4777-43d7-a60c-da767d1e5488\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " Apr 24 17:05:36.155772 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.155727 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7k8l\" (UniqueName: \"kubernetes.io/projected/f7394f44-4777-43d7-a60c-da767d1e5488-kube-api-access-h7k8l\") pod \"f7394f44-4777-43d7-a60c-da767d1e5488\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " Apr 24 17:05:36.155772 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.155758 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-tokenizer-cache\") pod \"f7394f44-4777-43d7-a60c-da767d1e5488\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " Apr 24 17:05:36.155846 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.155789 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f7394f44-4777-43d7-a60c-da767d1e5488-tls-certs\") pod \"f7394f44-4777-43d7-a60c-da767d1e5488\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " Apr 24 17:05:36.155897 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.155864 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-tokenizer-uds\") pod \"f7394f44-4777-43d7-a60c-da767d1e5488\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " Apr 24 17:05:36.155935 
ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.155905 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-kserve-provision-location\") pod \"f7394f44-4777-43d7-a60c-da767d1e5488\" (UID: \"f7394f44-4777-43d7-a60c-da767d1e5488\") " Apr 24 17:05:36.156046 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.156018 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "f7394f44-4777-43d7-a60c-da767d1e5488" (UID: "f7394f44-4777-43d7-a60c-da767d1e5488"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:05:36.156100 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.156031 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "f7394f44-4777-43d7-a60c-da767d1e5488" (UID: "f7394f44-4777-43d7-a60c-da767d1e5488"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:05:36.156170 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.156125 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-tokenizer-tmp\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 17:05:36.156266 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.156218 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "f7394f44-4777-43d7-a60c-da767d1e5488" (UID: "f7394f44-4777-43d7-a60c-da767d1e5488"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:05:36.156785 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.156757 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f7394f44-4777-43d7-a60c-da767d1e5488" (UID: "f7394f44-4777-43d7-a60c-da767d1e5488"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:05:36.157895 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.157867 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7394f44-4777-43d7-a60c-da767d1e5488-kube-api-access-h7k8l" (OuterVolumeSpecName: "kube-api-access-h7k8l") pod "f7394f44-4777-43d7-a60c-da767d1e5488" (UID: "f7394f44-4777-43d7-a60c-da767d1e5488"). InnerVolumeSpecName "kube-api-access-h7k8l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:05:36.157975 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.157905 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7394f44-4777-43d7-a60c-da767d1e5488-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f7394f44-4777-43d7-a60c-da767d1e5488" (UID: "f7394f44-4777-43d7-a60c-da767d1e5488"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:05:36.257564 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.257528 2563 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f7394f44-4777-43d7-a60c-da767d1e5488-tls-certs\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 17:05:36.257564 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.257558 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-tokenizer-uds\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 17:05:36.257758 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.257586 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-kserve-provision-location\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 17:05:36.257758 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.257603 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h7k8l\" (UniqueName: \"kubernetes.io/projected/f7394f44-4777-43d7-a60c-da767d1e5488-kube-api-access-h7k8l\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 17:05:36.257758 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.257613 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f7394f44-4777-43d7-a60c-da767d1e5488-tokenizer-cache\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 17:05:36.789001 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.788965 2563 generic.go:358] "Generic (PLEG): container finished" podID="f7394f44-4777-43d7-a60c-da767d1e5488" containerID="e1f548c64491033999402454efd403f4415276489a855bb7cb5df64c1ad3253a" exitCode=0 Apr 24 17:05:36.789184 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.789044 2563 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" event={"ID":"f7394f44-4777-43d7-a60c-da767d1e5488","Type":"ContainerDied","Data":"e1f548c64491033999402454efd403f4415276489a855bb7cb5df64c1ad3253a"} Apr 24 17:05:36.789184 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.789067 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" event={"ID":"f7394f44-4777-43d7-a60c-da767d1e5488","Type":"ContainerDied","Data":"fafcbfd0023a55f1fb8d4c5b0e704aa4b365f0667e748fa396d11e3526dbbba5"} Apr 24 17:05:36.789184 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.789082 2563 scope.go:117] "RemoveContainer" containerID="e1f548c64491033999402454efd403f4415276489a855bb7cb5df64c1ad3253a" Apr 24 17:05:36.789184 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.789107 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc" Apr 24 17:05:36.797327 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.797303 2563 scope.go:117] "RemoveContainer" containerID="771d90e2805c2d1e2701ad9436d9eb19b50ad25b3051093c4ccf797a3b87a34c" Apr 24 17:05:36.804032 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.804016 2563 scope.go:117] "RemoveContainer" containerID="4dd4c9d5629cd6216f9529e95b8afe02ffc750d138fcfde33867a4514aa2fe70" Apr 24 17:05:36.811066 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.811026 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc"] Apr 24 17:05:36.811814 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.811794 2563 scope.go:117] "RemoveContainer" containerID="e1f548c64491033999402454efd403f4415276489a855bb7cb5df64c1ad3253a" Apr 24 17:05:36.812071 ip-10-0-137-69 kubenswrapper[2563]: E0424 
17:05:36.812054 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1f548c64491033999402454efd403f4415276489a855bb7cb5df64c1ad3253a\": container with ID starting with e1f548c64491033999402454efd403f4415276489a855bb7cb5df64c1ad3253a not found: ID does not exist" containerID="e1f548c64491033999402454efd403f4415276489a855bb7cb5df64c1ad3253a" Apr 24 17:05:36.812129 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.812079 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1f548c64491033999402454efd403f4415276489a855bb7cb5df64c1ad3253a"} err="failed to get container status \"e1f548c64491033999402454efd403f4415276489a855bb7cb5df64c1ad3253a\": rpc error: code = NotFound desc = could not find container \"e1f548c64491033999402454efd403f4415276489a855bb7cb5df64c1ad3253a\": container with ID starting with e1f548c64491033999402454efd403f4415276489a855bb7cb5df64c1ad3253a not found: ID does not exist" Apr 24 17:05:36.812129 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.812096 2563 scope.go:117] "RemoveContainer" containerID="771d90e2805c2d1e2701ad9436d9eb19b50ad25b3051093c4ccf797a3b87a34c" Apr 24 17:05:36.812378 ip-10-0-137-69 kubenswrapper[2563]: E0424 17:05:36.812362 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"771d90e2805c2d1e2701ad9436d9eb19b50ad25b3051093c4ccf797a3b87a34c\": container with ID starting with 771d90e2805c2d1e2701ad9436d9eb19b50ad25b3051093c4ccf797a3b87a34c not found: ID does not exist" containerID="771d90e2805c2d1e2701ad9436d9eb19b50ad25b3051093c4ccf797a3b87a34c" Apr 24 17:05:36.812425 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.812388 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"771d90e2805c2d1e2701ad9436d9eb19b50ad25b3051093c4ccf797a3b87a34c"} err="failed to get container status 
\"771d90e2805c2d1e2701ad9436d9eb19b50ad25b3051093c4ccf797a3b87a34c\": rpc error: code = NotFound desc = could not find container \"771d90e2805c2d1e2701ad9436d9eb19b50ad25b3051093c4ccf797a3b87a34c\": container with ID starting with 771d90e2805c2d1e2701ad9436d9eb19b50ad25b3051093c4ccf797a3b87a34c not found: ID does not exist" Apr 24 17:05:36.812425 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.812403 2563 scope.go:117] "RemoveContainer" containerID="4dd4c9d5629cd6216f9529e95b8afe02ffc750d138fcfde33867a4514aa2fe70" Apr 24 17:05:36.812623 ip-10-0-137-69 kubenswrapper[2563]: E0424 17:05:36.812604 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dd4c9d5629cd6216f9529e95b8afe02ffc750d138fcfde33867a4514aa2fe70\": container with ID starting with 4dd4c9d5629cd6216f9529e95b8afe02ffc750d138fcfde33867a4514aa2fe70 not found: ID does not exist" containerID="4dd4c9d5629cd6216f9529e95b8afe02ffc750d138fcfde33867a4514aa2fe70" Apr 24 17:05:36.812672 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.812627 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dd4c9d5629cd6216f9529e95b8afe02ffc750d138fcfde33867a4514aa2fe70"} err="failed to get container status \"4dd4c9d5629cd6216f9529e95b8afe02ffc750d138fcfde33867a4514aa2fe70\": rpc error: code = NotFound desc = could not find container \"4dd4c9d5629cd6216f9529e95b8afe02ffc750d138fcfde33867a4514aa2fe70\": container with ID starting with 4dd4c9d5629cd6216f9529e95b8afe02ffc750d138fcfde33867a4514aa2fe70 not found: ID does not exist" Apr 24 17:05:36.812975 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:36.812958 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schewcgnc"] Apr 24 17:05:37.859412 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:37.859376 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f7394f44-4777-43d7-a60c-da767d1e5488" path="/var/lib/kubelet/pods/f7394f44-4777-43d7-a60c-da767d1e5488/volumes" Apr 24 17:05:43.240118 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.240084 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm"] Apr 24 17:05:43.240604 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.240524 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7394f44-4777-43d7-a60c-da767d1e5488" containerName="storage-initializer" Apr 24 17:05:43.240604 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.240540 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7394f44-4777-43d7-a60c-da767d1e5488" containerName="storage-initializer" Apr 24 17:05:43.240604 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.240554 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7394f44-4777-43d7-a60c-da767d1e5488" containerName="main" Apr 24 17:05:43.240604 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.240562 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7394f44-4777-43d7-a60c-da767d1e5488" containerName="main" Apr 24 17:05:43.240604 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.240578 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7394f44-4777-43d7-a60c-da767d1e5488" containerName="tokenizer" Apr 24 17:05:43.240604 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.240586 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7394f44-4777-43d7-a60c-da767d1e5488" containerName="tokenizer" Apr 24 17:05:43.240915 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.240668 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7394f44-4777-43d7-a60c-da767d1e5488" containerName="tokenizer" Apr 24 17:05:43.240915 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.240680 2563 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="f7394f44-4777-43d7-a60c-da767d1e5488" containerName="main" Apr 24 17:05:43.245675 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.245656 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:43.248163 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.248129 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-g6wtf\"" Apr 24 17:05:43.248515 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.248491 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 24 17:05:43.248803 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.248785 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 17:05:43.248854 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.248839 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q89jh\"" Apr 24 17:05:43.250818 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.250798 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 17:05:43.259424 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.259403 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm"] Apr 24 17:05:43.318541 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.318511 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b6be5d81-976c-4466-bc60-a53a45f886c3-tls-certs\") pod 
\"custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:43.318678 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.318583 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn5kd\" (UniqueName: \"kubernetes.io/projected/b6be5d81-976c-4466-bc60-a53a45f886c3-kube-api-access-fn5kd\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:43.318678 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.318627 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:43.318678 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.318658 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:43.318678 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.318675 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:43.318860 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.318691 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:43.419541 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.419492 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b6be5d81-976c-4466-bc60-a53a45f886c3-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:43.419752 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.419565 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fn5kd\" (UniqueName: \"kubernetes.io/projected/b6be5d81-976c-4466-bc60-a53a45f886c3-kube-api-access-fn5kd\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:43.419752 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.419608 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:43.419752 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.419631 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:43.419752 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.419657 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:43.419752 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.419684 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:43.420043 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.420019 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:43.420043 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.420035 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:43.420184 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.420097 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:43.420245 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.420194 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:43.422101 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.422080 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b6be5d81-976c-4466-bc60-a53a45f886c3-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:43.427908 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.427886 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn5kd\" (UniqueName: \"kubernetes.io/projected/b6be5d81-976c-4466-bc60-a53a45f886c3-kube-api-access-fn5kd\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:43.555816 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.555731 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:43.680417 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.680383 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm"] Apr 24 17:05:43.683595 ip-10-0-137-69 kubenswrapper[2563]: W0424 17:05:43.683566 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6be5d81_976c_4466_bc60_a53a45f886c3.slice/crio-51a2b02125e251ba38eead86f14ea584a23ba8b450bf96195b48cf7a10146a3a WatchSource:0}: Error finding container 51a2b02125e251ba38eead86f14ea584a23ba8b450bf96195b48cf7a10146a3a: Status 404 returned error can't find the container with id 51a2b02125e251ba38eead86f14ea584a23ba8b450bf96195b48cf7a10146a3a Apr 24 17:05:43.812234 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.812154 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" event={"ID":"b6be5d81-976c-4466-bc60-a53a45f886c3","Type":"ContainerStarted","Data":"9f436d7bee8fe4143b7a1de1832335240b0ca4b8eb22399e4673788e5b0b63cc"} Apr 24 17:05:43.812234 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:43.812192 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" event={"ID":"b6be5d81-976c-4466-bc60-a53a45f886c3","Type":"ContainerStarted","Data":"51a2b02125e251ba38eead86f14ea584a23ba8b450bf96195b48cf7a10146a3a"} Apr 24 17:05:44.816905 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:44.816867 2563 generic.go:358] "Generic (PLEG): container finished" podID="b6be5d81-976c-4466-bc60-a53a45f886c3" containerID="9f436d7bee8fe4143b7a1de1832335240b0ca4b8eb22399e4673788e5b0b63cc" exitCode=0 Apr 24 17:05:44.817360 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:44.816956 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" event={"ID":"b6be5d81-976c-4466-bc60-a53a45f886c3","Type":"ContainerDied","Data":"9f436d7bee8fe4143b7a1de1832335240b0ca4b8eb22399e4673788e5b0b63cc"} Apr 24 17:05:45.822697 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:45.822664 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" event={"ID":"b6be5d81-976c-4466-bc60-a53a45f886c3","Type":"ContainerStarted","Data":"7e0ae06275cd3e20d25810beaf9ee34668cf9aeca10878043e3189c3fbf6a12c"} Apr 24 17:05:45.822697 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:45.822703 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" 
event={"ID":"b6be5d81-976c-4466-bc60-a53a45f886c3","Type":"ContainerStarted","Data":"7ef6fb4589524e173ac474408811ac2a797128a8e1d242ee5293a2955061033d"} Apr 24 17:05:45.823181 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:45.822829 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:45.845787 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:45.845714 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" podStartSLOduration=2.845695171 podStartE2EDuration="2.845695171s" podCreationTimestamp="2026-04-24 17:05:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:05:45.844486467 +0000 UTC m=+1604.515117392" watchObservedRunningTime="2026-04-24 17:05:45.845695171 +0000 UTC m=+1604.516326073" Apr 24 17:05:53.556171 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:53.556070 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:53.556171 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:53.556114 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:53.558648 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:53.558625 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:05:53.848033 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:05:53.847953 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:06:14.851778 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:06:14.851750 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:08:41.573413 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.573366 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm"] Apr 24 17:08:41.574053 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.573937 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" podUID="b6be5d81-976c-4466-bc60-a53a45f886c3" containerName="tokenizer" containerID="cri-o://7e0ae06275cd3e20d25810beaf9ee34668cf9aeca10878043e3189c3fbf6a12c" gracePeriod=30 Apr 24 17:08:41.574131 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.574051 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" podUID="b6be5d81-976c-4466-bc60-a53a45f886c3" containerName="main" containerID="cri-o://7ef6fb4589524e173ac474408811ac2a797128a8e1d242ee5293a2955061033d" gracePeriod=30 Apr 24 17:08:41.575327 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.575304 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc"] Apr 24 17:08:41.578640 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.578620 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.589419 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.589395 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 24 17:08:41.589525 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.589457 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-bdlld\"" Apr 24 17:08:41.620186 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.618917 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc"] Apr 24 17:08:41.677840 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.677808 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/84e41cce-eb45-47d6-8491-6ec2143a84ed-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.678023 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.677872 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/84e41cce-eb45-47d6-8491-6ec2143a84ed-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.678023 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.677970 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/84e41cce-eb45-47d6-8491-6ec2143a84ed-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.678166 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.678021 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvpst\" (UniqueName: \"kubernetes.io/projected/84e41cce-eb45-47d6-8491-6ec2143a84ed-kube-api-access-dvpst\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.678166 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.678058 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/84e41cce-eb45-47d6-8491-6ec2143a84ed-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.678166 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.678129 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/84e41cce-eb45-47d6-8491-6ec2143a84ed-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.678294 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.678228 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/84e41cce-eb45-47d6-8491-6ec2143a84ed-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.678294 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.678284 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/84e41cce-eb45-47d6-8491-6ec2143a84ed-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.678379 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.678326 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/84e41cce-eb45-47d6-8491-6ec2143a84ed-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.779491 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.779453 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/84e41cce-eb45-47d6-8491-6ec2143a84ed-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.779664 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.779514 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/84e41cce-eb45-47d6-8491-6ec2143a84ed-istiod-ca-cert\") pod 
\"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.779664 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.779557 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/84e41cce-eb45-47d6-8491-6ec2143a84ed-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.779664 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.779585 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/84e41cce-eb45-47d6-8491-6ec2143a84ed-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.779664 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.779635 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/84e41cce-eb45-47d6-8491-6ec2143a84ed-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.779895 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.779666 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/84e41cce-eb45-47d6-8491-6ec2143a84ed-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.779895 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.779693 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvpst\" (UniqueName: \"kubernetes.io/projected/84e41cce-eb45-47d6-8491-6ec2143a84ed-kube-api-access-dvpst\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.779895 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.779729 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/84e41cce-eb45-47d6-8491-6ec2143a84ed-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.779895 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.779772 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/84e41cce-eb45-47d6-8491-6ec2143a84ed-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.780104 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.779921 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/84e41cce-eb45-47d6-8491-6ec2143a84ed-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.780104 ip-10-0-137-69 kubenswrapper[2563]: I0424 
17:08:41.780002 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/84e41cce-eb45-47d6-8491-6ec2143a84ed-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.780324 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.780303 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/84e41cce-eb45-47d6-8491-6ec2143a84ed-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.780565 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.780542 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/84e41cce-eb45-47d6-8491-6ec2143a84ed-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.780652 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.780604 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/84e41cce-eb45-47d6-8491-6ec2143a84ed-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.782563 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.782516 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/84e41cce-eb45-47d6-8491-6ec2143a84ed-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.782812 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.782791 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/84e41cce-eb45-47d6-8491-6ec2143a84ed-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.787427 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.787398 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/84e41cce-eb45-47d6-8491-6ec2143a84ed-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.787510 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.787474 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvpst\" (UniqueName: \"kubernetes.io/projected/84e41cce-eb45-47d6-8491-6ec2143a84ed-kube-api-access-dvpst\") pod \"router-gateway-2-openshift-default-6866b85949-6bxxc\" (UID: \"84e41cce-eb45-47d6-8491-6ec2143a84ed\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:41.900364 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:41.900284 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" Apr 24 17:08:42.038659 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:42.038623 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc"] Apr 24 17:08:42.041468 ip-10-0-137-69 kubenswrapper[2563]: W0424 17:08:42.041441 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84e41cce_eb45_47d6_8491_6ec2143a84ed.slice/crio-0c31fcdffc8070442a85832fbf16477449e481abd64a2db547d3ba9a41db1671 WatchSource:0}: Error finding container 0c31fcdffc8070442a85832fbf16477449e481abd64a2db547d3ba9a41db1671: Status 404 returned error can't find the container with id 0c31fcdffc8070442a85832fbf16477449e481abd64a2db547d3ba9a41db1671 Apr 24 17:08:42.043236 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:42.043220 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 17:08:42.359449 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:42.359415 2563 generic.go:358] "Generic (PLEG): container finished" podID="b6be5d81-976c-4466-bc60-a53a45f886c3" containerID="7ef6fb4589524e173ac474408811ac2a797128a8e1d242ee5293a2955061033d" exitCode=0 Apr 24 17:08:42.359658 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:42.359483 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" event={"ID":"b6be5d81-976c-4466-bc60-a53a45f886c3","Type":"ContainerDied","Data":"7ef6fb4589524e173ac474408811ac2a797128a8e1d242ee5293a2955061033d"} Apr 24 17:08:42.360664 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:42.360640 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" 
event={"ID":"84e41cce-eb45-47d6-8491-6ec2143a84ed","Type":"ContainerStarted","Data":"0c31fcdffc8070442a85832fbf16477449e481abd64a2db547d3ba9a41db1671"} Apr 24 17:08:42.821848 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:42.821820 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" Apr 24 17:08:42.991751 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:42.991714 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-kserve-provision-location\") pod \"b6be5d81-976c-4466-bc60-a53a45f886c3\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " Apr 24 17:08:42.991965 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:42.991823 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b6be5d81-976c-4466-bc60-a53a45f886c3-tls-certs\") pod \"b6be5d81-976c-4466-bc60-a53a45f886c3\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " Apr 24 17:08:42.991965 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:42.991879 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-tokenizer-tmp\") pod \"b6be5d81-976c-4466-bc60-a53a45f886c3\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " Apr 24 17:08:42.991965 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:42.991912 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-tokenizer-uds\") pod \"b6be5d81-976c-4466-bc60-a53a45f886c3\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " Apr 24 17:08:42.991965 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:42.991949 2563 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-tokenizer-cache\") pod \"b6be5d81-976c-4466-bc60-a53a45f886c3\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " Apr 24 17:08:42.992213 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:42.991976 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn5kd\" (UniqueName: \"kubernetes.io/projected/b6be5d81-976c-4466-bc60-a53a45f886c3-kube-api-access-fn5kd\") pod \"b6be5d81-976c-4466-bc60-a53a45f886c3\" (UID: \"b6be5d81-976c-4466-bc60-a53a45f886c3\") " Apr 24 17:08:42.992271 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:42.992244 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "b6be5d81-976c-4466-bc60-a53a45f886c3" (UID: "b6be5d81-976c-4466-bc60-a53a45f886c3"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:08:42.992355 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:42.992323 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "b6be5d81-976c-4466-bc60-a53a45f886c3" (UID: "b6be5d81-976c-4466-bc60-a53a45f886c3"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:08:42.992355 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:42.992339 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "b6be5d81-976c-4466-bc60-a53a45f886c3" (UID: "b6be5d81-976c-4466-bc60-a53a45f886c3"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 17:08:42.992683 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:42.992656 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b6be5d81-976c-4466-bc60-a53a45f886c3" (UID: "b6be5d81-976c-4466-bc60-a53a45f886c3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 17:08:42.994870 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:42.994839 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6be5d81-976c-4466-bc60-a53a45f886c3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b6be5d81-976c-4466-bc60-a53a45f886c3" (UID: "b6be5d81-976c-4466-bc60-a53a45f886c3"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 17:08:42.995101 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:42.995078 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6be5d81-976c-4466-bc60-a53a45f886c3-kube-api-access-fn5kd" (OuterVolumeSpecName: "kube-api-access-fn5kd") pod "b6be5d81-976c-4466-bc60-a53a45f886c3" (UID: "b6be5d81-976c-4466-bc60-a53a45f886c3"). InnerVolumeSpecName "kube-api-access-fn5kd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 17:08:43.092811 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.092779 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-kserve-provision-location\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\""
Apr 24 17:08:43.092811 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.092810 2563 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b6be5d81-976c-4466-bc60-a53a45f886c3-tls-certs\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\""
Apr 24 17:08:43.092988 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.092822 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-tokenizer-tmp\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\""
Apr 24 17:08:43.092988 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.092831 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-tokenizer-uds\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\""
Apr 24 17:08:43.092988 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.092839 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6be5d81-976c-4466-bc60-a53a45f886c3-tokenizer-cache\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\""
Apr 24 17:08:43.092988 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.092847 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fn5kd\" (UniqueName: \"kubernetes.io/projected/b6be5d81-976c-4466-bc60-a53a45f886c3-kube-api-access-fn5kd\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\""
Apr 24 17:08:43.366538 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.366450 2563 generic.go:358] "Generic (PLEG): container finished" podID="b6be5d81-976c-4466-bc60-a53a45f886c3" containerID="7e0ae06275cd3e20d25810beaf9ee34668cf9aeca10878043e3189c3fbf6a12c" exitCode=0
Apr 24 17:08:43.366538 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.366517 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" event={"ID":"b6be5d81-976c-4466-bc60-a53a45f886c3","Type":"ContainerDied","Data":"7e0ae06275cd3e20d25810beaf9ee34668cf9aeca10878043e3189c3fbf6a12c"}
Apr 24 17:08:43.366538 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.366535 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm"
Apr 24 17:08:43.366815 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.366550 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm" event={"ID":"b6be5d81-976c-4466-bc60-a53a45f886c3","Type":"ContainerDied","Data":"51a2b02125e251ba38eead86f14ea584a23ba8b450bf96195b48cf7a10146a3a"}
Apr 24 17:08:43.366815 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.366570 2563 scope.go:117] "RemoveContainer" containerID="7e0ae06275cd3e20d25810beaf9ee34668cf9aeca10878043e3189c3fbf6a12c"
Apr 24 17:08:43.376643 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.376623 2563 scope.go:117] "RemoveContainer" containerID="7ef6fb4589524e173ac474408811ac2a797128a8e1d242ee5293a2955061033d"
Apr 24 17:08:43.384649 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.384620 2563 scope.go:117] "RemoveContainer" containerID="9f436d7bee8fe4143b7a1de1832335240b0ca4b8eb22399e4673788e5b0b63cc"
Apr 24 17:08:43.392943 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.392923 2563 scope.go:117] "RemoveContainer" containerID="7e0ae06275cd3e20d25810beaf9ee34668cf9aeca10878043e3189c3fbf6a12c"
Apr 24 17:08:43.393337 ip-10-0-137-69 kubenswrapper[2563]: E0424 17:08:43.393314 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e0ae06275cd3e20d25810beaf9ee34668cf9aeca10878043e3189c3fbf6a12c\": container with ID starting with 7e0ae06275cd3e20d25810beaf9ee34668cf9aeca10878043e3189c3fbf6a12c not found: ID does not exist" containerID="7e0ae06275cd3e20d25810beaf9ee34668cf9aeca10878043e3189c3fbf6a12c"
Apr 24 17:08:43.393427 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.393348 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e0ae06275cd3e20d25810beaf9ee34668cf9aeca10878043e3189c3fbf6a12c"} err="failed to get container status \"7e0ae06275cd3e20d25810beaf9ee34668cf9aeca10878043e3189c3fbf6a12c\": rpc error: code = NotFound desc = could not find container \"7e0ae06275cd3e20d25810beaf9ee34668cf9aeca10878043e3189c3fbf6a12c\": container with ID starting with 7e0ae06275cd3e20d25810beaf9ee34668cf9aeca10878043e3189c3fbf6a12c not found: ID does not exist"
Apr 24 17:08:43.393427 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.393374 2563 scope.go:117] "RemoveContainer" containerID="7ef6fb4589524e173ac474408811ac2a797128a8e1d242ee5293a2955061033d"
Apr 24 17:08:43.393650 ip-10-0-137-69 kubenswrapper[2563]: E0424 17:08:43.393621 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ef6fb4589524e173ac474408811ac2a797128a8e1d242ee5293a2955061033d\": container with ID starting with 7ef6fb4589524e173ac474408811ac2a797128a8e1d242ee5293a2955061033d not found: ID does not exist" containerID="7ef6fb4589524e173ac474408811ac2a797128a8e1d242ee5293a2955061033d"
Apr 24 17:08:43.393692 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.393661 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ef6fb4589524e173ac474408811ac2a797128a8e1d242ee5293a2955061033d"} err="failed to get container status \"7ef6fb4589524e173ac474408811ac2a797128a8e1d242ee5293a2955061033d\": rpc error: code = NotFound desc = could not find container \"7ef6fb4589524e173ac474408811ac2a797128a8e1d242ee5293a2955061033d\": container with ID starting with 7ef6fb4589524e173ac474408811ac2a797128a8e1d242ee5293a2955061033d not found: ID does not exist"
Apr 24 17:08:43.393692 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.393685 2563 scope.go:117] "RemoveContainer" containerID="9f436d7bee8fe4143b7a1de1832335240b0ca4b8eb22399e4673788e5b0b63cc"
Apr 24 17:08:43.393929 ip-10-0-137-69 kubenswrapper[2563]: E0424 17:08:43.393907 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f436d7bee8fe4143b7a1de1832335240b0ca4b8eb22399e4673788e5b0b63cc\": container with ID starting with 9f436d7bee8fe4143b7a1de1832335240b0ca4b8eb22399e4673788e5b0b63cc not found: ID does not exist" containerID="9f436d7bee8fe4143b7a1de1832335240b0ca4b8eb22399e4673788e5b0b63cc"
Apr 24 17:08:43.394024 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.393958 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f436d7bee8fe4143b7a1de1832335240b0ca4b8eb22399e4673788e5b0b63cc"} err="failed to get container status \"9f436d7bee8fe4143b7a1de1832335240b0ca4b8eb22399e4673788e5b0b63cc\": rpc error: code = NotFound desc = could not find container \"9f436d7bee8fe4143b7a1de1832335240b0ca4b8eb22399e4673788e5b0b63cc\": container with ID starting with 9f436d7bee8fe4143b7a1de1832335240b0ca4b8eb22399e4673788e5b0b63cc not found: ID does not exist"
Apr 24 17:08:43.398203 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.398181 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm"]
Apr 24 17:08:43.407752 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.407731 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-59978f5ncm"]
Apr 24 17:08:43.861578 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:43.861545 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6be5d81-976c-4466-bc60-a53a45f886c3" path="/var/lib/kubelet/pods/b6be5d81-976c-4466-bc60-a53a45f886c3/volumes"
Apr 24 17:08:44.451911 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:44.451877 2563 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"}
Apr 24 17:08:44.451995 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:44.451965 2563 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"}
Apr 24 17:08:44.452038 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:44.451995 2563 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"}
Apr 24 17:08:45.375063 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:45.375020 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" event={"ID":"84e41cce-eb45-47d6-8491-6ec2143a84ed","Type":"ContainerStarted","Data":"b5eceb80f0c648b045a94b46add25d7e984b963cdc2e0ff91bd19d0bac081aeb"}
Apr 24 17:08:45.395785 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:45.395734 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" podStartSLOduration=1.987387382 podStartE2EDuration="4.395716964s" podCreationTimestamp="2026-04-24 17:08:41 +0000 UTC" firstStartedPulling="2026-04-24 17:08:42.043342326 +0000 UTC m=+1780.713973206" lastFinishedPulling="2026-04-24 17:08:44.451671908 +0000 UTC m=+1783.122302788" observedRunningTime="2026-04-24 17:08:45.39370314 +0000 UTC m=+1784.064334041" watchObservedRunningTime="2026-04-24 17:08:45.395716964 +0000 UTC m=+1784.066347869"
Apr 24 17:08:45.901368 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:45.901343 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc"
Apr 24 17:08:45.902508 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:45.902475 2563 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" podUID="84e41cce-eb45-47d6-8491-6ec2143a84ed" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.132.0.37:15021/healthz/ready\": dial tcp 10.132.0.37:15021: connect: connection refused"
Apr 24 17:08:46.901349 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:46.901300 2563 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc" podUID="84e41cce-eb45-47d6-8491-6ec2143a84ed" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.132.0.37:15021/healthz/ready\": dial tcp 10.132.0.37:15021: connect: connection refused"
Apr 24 17:08:47.904531 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:47.904503 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc"
Apr 24 17:08:48.386852 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:48.386815 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc"
Apr 24 17:08:48.387849 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:48.387831 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6bxxc"
Apr 24 17:08:52.803289 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:52.803216 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"]
Apr 24 17:08:52.803627 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:52.803547 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6be5d81-976c-4466-bc60-a53a45f886c3" containerName="storage-initializer"
Apr 24 17:08:52.803627 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:52.803563 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6be5d81-976c-4466-bc60-a53a45f886c3" containerName="storage-initializer"
Apr 24 17:08:52.803627 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:52.803578 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6be5d81-976c-4466-bc60-a53a45f886c3" containerName="tokenizer"
Apr 24 17:08:52.803627 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:52.803584 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6be5d81-976c-4466-bc60-a53a45f886c3" containerName="tokenizer"
Apr 24 17:08:52.803627 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:52.803591 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6be5d81-976c-4466-bc60-a53a45f886c3" containerName="main"
Apr 24 17:08:52.803627 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:52.803597 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6be5d81-976c-4466-bc60-a53a45f886c3" containerName="main"
Apr 24 17:08:52.803814 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:52.803641 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="b6be5d81-976c-4466-bc60-a53a45f886c3" containerName="main"
Apr 24 17:08:52.803814 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:52.803647 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="b6be5d81-976c-4466-bc60-a53a45f886c3" containerName="tokenizer"
Apr 24 17:08:52.806728 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:52.806706 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:52.809639 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:52.809617 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q89jh\""
Apr 24 17:08:52.809750 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:52.809647 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-x97c8\""
Apr 24 17:08:52.809750 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:52.809647 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\""
Apr 24 17:08:52.816740 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:52.816712 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"]
Apr 24 17:08:52.975483 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:52.975439 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:52.975668 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:52.975498 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:52.975668 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:52.975549 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/527ef463-067b-4d9c-9e0b-2060f98fe204-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:52.975668 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:52.975581 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:52.975668 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:52.975622 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:52.975668 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:52.975646 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwtwk\" (UniqueName: \"kubernetes.io/projected/527ef463-067b-4d9c-9e0b-2060f98fe204-kube-api-access-qwtwk\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:53.076260 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:53.076172 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:53.076260 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:53.076219 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwtwk\" (UniqueName: \"kubernetes.io/projected/527ef463-067b-4d9c-9e0b-2060f98fe204-kube-api-access-qwtwk\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:53.076485 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:53.076313 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:53.076485 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:53.076342 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:53.076485 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:53.076382 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/527ef463-067b-4d9c-9e0b-2060f98fe204-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:53.076485 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:53.076417 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:53.076686 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:53.076623 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:53.076721 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:53.076690 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:53.076774 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:53.076751 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:53.076819 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:53.076806 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:53.078847 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:53.078824 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/527ef463-067b-4d9c-9e0b-2060f98fe204-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:53.084403 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:53.084381 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwtwk\" (UniqueName: \"kubernetes.io/projected/527ef463-067b-4d9c-9e0b-2060f98fe204-kube-api-access-qwtwk\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:53.116750 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:53.116726 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:53.249259 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:53.249227 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"]
Apr 24 17:08:53.251688 ip-10-0-137-69 kubenswrapper[2563]: W0424 17:08:53.251659 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod527ef463_067b_4d9c_9e0b_2060f98fe204.slice/crio-6cbe36d9dee268360c54d0c0a08335d9eaeb27dec415b68246fb56670a6a0207 WatchSource:0}: Error finding container 6cbe36d9dee268360c54d0c0a08335d9eaeb27dec415b68246fb56670a6a0207: Status 404 returned error can't find the container with id 6cbe36d9dee268360c54d0c0a08335d9eaeb27dec415b68246fb56670a6a0207
Apr 24 17:08:53.405635 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:53.405556 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf" event={"ID":"527ef463-067b-4d9c-9e0b-2060f98fe204","Type":"ContainerStarted","Data":"c842ef5a54c9e11da952c9d0a9d4dbc580e28d118a514255bae355d3fbfa5187"}
Apr 24 17:08:53.405635 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:53.405595 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf" event={"ID":"527ef463-067b-4d9c-9e0b-2060f98fe204","Type":"ContainerStarted","Data":"6cbe36d9dee268360c54d0c0a08335d9eaeb27dec415b68246fb56670a6a0207"}
Apr 24 17:08:54.409658 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:54.409573 2563 generic.go:358] "Generic (PLEG): container finished" podID="527ef463-067b-4d9c-9e0b-2060f98fe204" containerID="c842ef5a54c9e11da952c9d0a9d4dbc580e28d118a514255bae355d3fbfa5187" exitCode=0
Apr 24 17:08:54.410071 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:54.409663 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf" event={"ID":"527ef463-067b-4d9c-9e0b-2060f98fe204","Type":"ContainerDied","Data":"c842ef5a54c9e11da952c9d0a9d4dbc580e28d118a514255bae355d3fbfa5187"}
Apr 24 17:08:55.415165 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:55.415108 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf" event={"ID":"527ef463-067b-4d9c-9e0b-2060f98fe204","Type":"ContainerStarted","Data":"b3cf1c7a34a0722ef28ac53df3112c8fa3b416f19cae7487813ef75e3ff23379"}
Apr 24 17:08:55.415165 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:55.415168 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf" event={"ID":"527ef463-067b-4d9c-9e0b-2060f98fe204","Type":"ContainerStarted","Data":"30a28fd8f5bd1f5d34c9201650f1440502d539482e76fbe12c3ed492ddac07be"}
Apr 24 17:08:55.415672 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:55.415208 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:08:55.437860 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:08:55.437772 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf" podStartSLOduration=3.437753408 podStartE2EDuration="3.437753408s" podCreationTimestamp="2026-04-24 17:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:08:55.435500493 +0000 UTC m=+1794.106131407" watchObservedRunningTime="2026-04-24 17:08:55.437753408 +0000 UTC m=+1794.108384311"
Apr 24 17:09:03.117769 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:09:03.117734 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:09:03.117769 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:09:03.117773 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:09:03.120517 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:09:03.120492 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:09:03.441546 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:09:03.441521 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:09:24.445688 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:09:24.445660 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:12:13.773648 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:13.773613 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"]
Apr 24 17:12:13.774479 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:13.774020 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf" podUID="527ef463-067b-4d9c-9e0b-2060f98fe204" containerName="main" containerID="cri-o://30a28fd8f5bd1f5d34c9201650f1440502d539482e76fbe12c3ed492ddac07be" gracePeriod=30
Apr 24 17:12:13.774479 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:13.774057 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf" podUID="527ef463-067b-4d9c-9e0b-2060f98fe204" containerName="tokenizer" containerID="cri-o://b3cf1c7a34a0722ef28ac53df3112c8fa3b416f19cae7487813ef75e3ff23379" gracePeriod=30
Apr 24 17:12:14.024812 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:14.024721 2563 generic.go:358] "Generic (PLEG): container finished" podID="527ef463-067b-4d9c-9e0b-2060f98fe204" containerID="30a28fd8f5bd1f5d34c9201650f1440502d539482e76fbe12c3ed492ddac07be" exitCode=0
Apr 24 17:12:14.024812 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:14.024772 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf" event={"ID":"527ef463-067b-4d9c-9e0b-2060f98fe204","Type":"ContainerDied","Data":"30a28fd8f5bd1f5d34c9201650f1440502d539482e76fbe12c3ed492ddac07be"}
Apr 24 17:12:14.444331 ip-10-0-137-69 kubenswrapper[2563]: W0424 17:12:14.444308 2563 logging.go:55] [core] [Channel #701 SubChannel #702]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.38:9003", ServerName: "10.132.0.38:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.38:9003: connect: connection refused"
Apr 24 17:12:14.929823 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:14.929802 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:12:15.029182 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.029086 2563 generic.go:358] "Generic (PLEG): container finished" podID="527ef463-067b-4d9c-9e0b-2060f98fe204" containerID="b3cf1c7a34a0722ef28ac53df3112c8fa3b416f19cae7487813ef75e3ff23379" exitCode=0
Apr 24 17:12:15.029182 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.029163 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf" event={"ID":"527ef463-067b-4d9c-9e0b-2060f98fe204","Type":"ContainerDied","Data":"b3cf1c7a34a0722ef28ac53df3112c8fa3b416f19cae7487813ef75e3ff23379"}
Apr 24 17:12:15.029182 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.029174 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"
Apr 24 17:12:15.029412 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.029194 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf" event={"ID":"527ef463-067b-4d9c-9e0b-2060f98fe204","Type":"ContainerDied","Data":"6cbe36d9dee268360c54d0c0a08335d9eaeb27dec415b68246fb56670a6a0207"}
Apr 24 17:12:15.029412 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.029210 2563 scope.go:117] "RemoveContainer" containerID="b3cf1c7a34a0722ef28ac53df3112c8fa3b416f19cae7487813ef75e3ff23379"
Apr 24 17:12:15.035830 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.035802 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-tokenizer-uds\") pod \"527ef463-067b-4d9c-9e0b-2060f98fe204\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") "
Apr 24 17:12:15.035931 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.035846 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-tokenizer-cache\") pod \"527ef463-067b-4d9c-9e0b-2060f98fe204\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") "
Apr 24 17:12:15.035931 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.035890 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-tokenizer-tmp\") pod \"527ef463-067b-4d9c-9e0b-2060f98fe204\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") "
Apr 24 17:12:15.035931 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.035908 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/527ef463-067b-4d9c-9e0b-2060f98fe204-tls-certs\") pod \"527ef463-067b-4d9c-9e0b-2060f98fe204\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") "
Apr 24 17:12:15.036082 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.035949 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwtwk\" (UniqueName: \"kubernetes.io/projected/527ef463-067b-4d9c-9e0b-2060f98fe204-kube-api-access-qwtwk\") pod \"527ef463-067b-4d9c-9e0b-2060f98fe204\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") "
Apr 24 17:12:15.036082 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.035986 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-kserve-provision-location\") pod \"527ef463-067b-4d9c-9e0b-2060f98fe204\" (UID: \"527ef463-067b-4d9c-9e0b-2060f98fe204\") "
Apr 24 17:12:15.036197 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.036079 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "527ef463-067b-4d9c-9e0b-2060f98fe204" (UID: "527ef463-067b-4d9c-9e0b-2060f98fe204"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 17:12:15.036197 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.036097 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "527ef463-067b-4d9c-9e0b-2060f98fe204" (UID: "527ef463-067b-4d9c-9e0b-2060f98fe204"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 17:12:15.036306 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.036246 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-tokenizer-cache\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\""
Apr 24 17:12:15.036306 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.036260 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-tokenizer-uds\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\""
Apr 24 17:12:15.036460 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.036351 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "527ef463-067b-4d9c-9e0b-2060f98fe204" (UID: "527ef463-067b-4d9c-9e0b-2060f98fe204"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 17:12:15.036802 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.036779 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "527ef463-067b-4d9c-9e0b-2060f98fe204" (UID: "527ef463-067b-4d9c-9e0b-2060f98fe204"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 17:12:15.036949 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.036932 2563 scope.go:117] "RemoveContainer" containerID="30a28fd8f5bd1f5d34c9201650f1440502d539482e76fbe12c3ed492ddac07be"
Apr 24 17:12:15.038314 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.038292 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/527ef463-067b-4d9c-9e0b-2060f98fe204-kube-api-access-qwtwk" (OuterVolumeSpecName: "kube-api-access-qwtwk") pod "527ef463-067b-4d9c-9e0b-2060f98fe204" (UID: "527ef463-067b-4d9c-9e0b-2060f98fe204"). InnerVolumeSpecName "kube-api-access-qwtwk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 17:12:15.038415 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.038397 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527ef463-067b-4d9c-9e0b-2060f98fe204-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "527ef463-067b-4d9c-9e0b-2060f98fe204" (UID: "527ef463-067b-4d9c-9e0b-2060f98fe204"). InnerVolumeSpecName "tls-certs".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:12:15.051734 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.051715 2563 scope.go:117] "RemoveContainer" containerID="c842ef5a54c9e11da952c9d0a9d4dbc580e28d118a514255bae355d3fbfa5187" Apr 24 17:12:15.058907 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.058875 2563 scope.go:117] "RemoveContainer" containerID="b3cf1c7a34a0722ef28ac53df3112c8fa3b416f19cae7487813ef75e3ff23379" Apr 24 17:12:15.059146 ip-10-0-137-69 kubenswrapper[2563]: E0424 17:12:15.059116 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3cf1c7a34a0722ef28ac53df3112c8fa3b416f19cae7487813ef75e3ff23379\": container with ID starting with b3cf1c7a34a0722ef28ac53df3112c8fa3b416f19cae7487813ef75e3ff23379 not found: ID does not exist" containerID="b3cf1c7a34a0722ef28ac53df3112c8fa3b416f19cae7487813ef75e3ff23379" Apr 24 17:12:15.059207 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.059161 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3cf1c7a34a0722ef28ac53df3112c8fa3b416f19cae7487813ef75e3ff23379"} err="failed to get container status \"b3cf1c7a34a0722ef28ac53df3112c8fa3b416f19cae7487813ef75e3ff23379\": rpc error: code = NotFound desc = could not find container \"b3cf1c7a34a0722ef28ac53df3112c8fa3b416f19cae7487813ef75e3ff23379\": container with ID starting with b3cf1c7a34a0722ef28ac53df3112c8fa3b416f19cae7487813ef75e3ff23379 not found: ID does not exist" Apr 24 17:12:15.059207 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.059180 2563 scope.go:117] "RemoveContainer" containerID="30a28fd8f5bd1f5d34c9201650f1440502d539482e76fbe12c3ed492ddac07be" Apr 24 17:12:15.059457 ip-10-0-137-69 kubenswrapper[2563]: E0424 17:12:15.059441 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"30a28fd8f5bd1f5d34c9201650f1440502d539482e76fbe12c3ed492ddac07be\": container with ID starting with 30a28fd8f5bd1f5d34c9201650f1440502d539482e76fbe12c3ed492ddac07be not found: ID does not exist" containerID="30a28fd8f5bd1f5d34c9201650f1440502d539482e76fbe12c3ed492ddac07be" Apr 24 17:12:15.059502 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.059462 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30a28fd8f5bd1f5d34c9201650f1440502d539482e76fbe12c3ed492ddac07be"} err="failed to get container status \"30a28fd8f5bd1f5d34c9201650f1440502d539482e76fbe12c3ed492ddac07be\": rpc error: code = NotFound desc = could not find container \"30a28fd8f5bd1f5d34c9201650f1440502d539482e76fbe12c3ed492ddac07be\": container with ID starting with 30a28fd8f5bd1f5d34c9201650f1440502d539482e76fbe12c3ed492ddac07be not found: ID does not exist" Apr 24 17:12:15.059502 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.059487 2563 scope.go:117] "RemoveContainer" containerID="c842ef5a54c9e11da952c9d0a9d4dbc580e28d118a514255bae355d3fbfa5187" Apr 24 17:12:15.059721 ip-10-0-137-69 kubenswrapper[2563]: E0424 17:12:15.059704 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c842ef5a54c9e11da952c9d0a9d4dbc580e28d118a514255bae355d3fbfa5187\": container with ID starting with c842ef5a54c9e11da952c9d0a9d4dbc580e28d118a514255bae355d3fbfa5187 not found: ID does not exist" containerID="c842ef5a54c9e11da952c9d0a9d4dbc580e28d118a514255bae355d3fbfa5187" Apr 24 17:12:15.059780 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.059740 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c842ef5a54c9e11da952c9d0a9d4dbc580e28d118a514255bae355d3fbfa5187"} err="failed to get container status \"c842ef5a54c9e11da952c9d0a9d4dbc580e28d118a514255bae355d3fbfa5187\": rpc error: code = NotFound desc = could not find container 
\"c842ef5a54c9e11da952c9d0a9d4dbc580e28d118a514255bae355d3fbfa5187\": container with ID starting with c842ef5a54c9e11da952c9d0a9d4dbc580e28d118a514255bae355d3fbfa5187 not found: ID does not exist" Apr 24 17:12:15.136813 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.136788 2563 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-tokenizer-tmp\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 17:12:15.136813 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.136810 2563 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/527ef463-067b-4d9c-9e0b-2060f98fe204-tls-certs\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 17:12:15.136945 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.136821 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qwtwk\" (UniqueName: \"kubernetes.io/projected/527ef463-067b-4d9c-9e0b-2060f98fe204-kube-api-access-qwtwk\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 17:12:15.136945 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.136830 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/527ef463-067b-4d9c-9e0b-2060f98fe204-kserve-provision-location\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 17:12:15.350756 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.350728 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"] Apr 24 17:12:15.353038 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.353015 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf"] Apr 24 17:12:15.444560 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.444518 2563 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5df45d567t29tf" podUID="527ef463-067b-4d9c-9e0b-2060f98fe204" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.38:9003\" within 1s: context deadline exceeded" Apr 24 17:12:15.860157 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:12:15.860109 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="527ef463-067b-4d9c-9e0b-2060f98fe204" path="/var/lib/kubelet/pods/527ef463-067b-4d9c-9e0b-2060f98fe204/volumes" Apr 24 17:13:34.693303 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.693215 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8"] Apr 24 17:13:34.693741 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.693497 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="527ef463-067b-4d9c-9e0b-2060f98fe204" containerName="main" Apr 24 17:13:34.693741 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.693507 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="527ef463-067b-4d9c-9e0b-2060f98fe204" containerName="main" Apr 24 17:13:34.693741 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.693526 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="527ef463-067b-4d9c-9e0b-2060f98fe204" containerName="tokenizer" Apr 24 17:13:34.693741 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.693532 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="527ef463-067b-4d9c-9e0b-2060f98fe204" containerName="tokenizer" Apr 24 17:13:34.693741 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.693537 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="527ef463-067b-4d9c-9e0b-2060f98fe204" containerName="storage-initializer" Apr 24 17:13:34.693741 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.693543 
2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="527ef463-067b-4d9c-9e0b-2060f98fe204" containerName="storage-initializer" Apr 24 17:13:34.693741 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.693588 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="527ef463-067b-4d9c-9e0b-2060f98fe204" containerName="tokenizer" Apr 24 17:13:34.693741 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.693596 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="527ef463-067b-4d9c-9e0b-2060f98fe204" containerName="main" Apr 24 17:13:34.696843 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.696819 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:34.699585 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.699554 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-hqqcv\"" Apr 24 17:13:34.700388 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.700363 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q89jh\"" Apr 24 17:13:34.700527 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.700371 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 24 17:13:34.710855 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.710824 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8"] Apr 24 17:13:34.849496 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.849460 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqvd7\" (UniqueName: \"kubernetes.io/projected/a31a70b8-65d8-418e-aa56-02717c6e7e73-kube-api-access-wqvd7\") 
pod \"stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8\" (UID: \"a31a70b8-65d8-418e-aa56-02717c6e7e73\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:34.849672 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.849516 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a31a70b8-65d8-418e-aa56-02717c6e7e73-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8\" (UID: \"a31a70b8-65d8-418e-aa56-02717c6e7e73\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:34.849672 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.849640 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a31a70b8-65d8-418e-aa56-02717c6e7e73-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8\" (UID: \"a31a70b8-65d8-418e-aa56-02717c6e7e73\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:34.849750 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.849679 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a31a70b8-65d8-418e-aa56-02717c6e7e73-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8\" (UID: \"a31a70b8-65d8-418e-aa56-02717c6e7e73\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:34.849750 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.849721 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a31a70b8-65d8-418e-aa56-02717c6e7e73-tokenizer-uds\") pod 
\"stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8\" (UID: \"a31a70b8-65d8-418e-aa56-02717c6e7e73\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:34.849860 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.849772 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a31a70b8-65d8-418e-aa56-02717c6e7e73-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8\" (UID: \"a31a70b8-65d8-418e-aa56-02717c6e7e73\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:34.951127 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.951048 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a31a70b8-65d8-418e-aa56-02717c6e7e73-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8\" (UID: \"a31a70b8-65d8-418e-aa56-02717c6e7e73\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:34.951127 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.951089 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a31a70b8-65d8-418e-aa56-02717c6e7e73-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8\" (UID: \"a31a70b8-65d8-418e-aa56-02717c6e7e73\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:34.951337 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.951153 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqvd7\" (UniqueName: \"kubernetes.io/projected/a31a70b8-65d8-418e-aa56-02717c6e7e73-kube-api-access-wqvd7\") pod 
\"stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8\" (UID: \"a31a70b8-65d8-418e-aa56-02717c6e7e73\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:34.951337 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.951178 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a31a70b8-65d8-418e-aa56-02717c6e7e73-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8\" (UID: \"a31a70b8-65d8-418e-aa56-02717c6e7e73\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:34.951337 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.951215 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a31a70b8-65d8-418e-aa56-02717c6e7e73-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8\" (UID: \"a31a70b8-65d8-418e-aa56-02717c6e7e73\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:34.951638 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.951506 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a31a70b8-65d8-418e-aa56-02717c6e7e73-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8\" (UID: \"a31a70b8-65d8-418e-aa56-02717c6e7e73\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:34.951638 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.951525 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a31a70b8-65d8-418e-aa56-02717c6e7e73-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8\" (UID: \"a31a70b8-65d8-418e-aa56-02717c6e7e73\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:34.951638 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.951558 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a31a70b8-65d8-418e-aa56-02717c6e7e73-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8\" (UID: \"a31a70b8-65d8-418e-aa56-02717c6e7e73\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:34.951638 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.951592 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a31a70b8-65d8-418e-aa56-02717c6e7e73-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8\" (UID: \"a31a70b8-65d8-418e-aa56-02717c6e7e73\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:34.951855 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.951775 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a31a70b8-65d8-418e-aa56-02717c6e7e73-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8\" (UID: \"a31a70b8-65d8-418e-aa56-02717c6e7e73\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:34.953563 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.953543 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a31a70b8-65d8-418e-aa56-02717c6e7e73-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8\" (UID: \"a31a70b8-65d8-418e-aa56-02717c6e7e73\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:34.959229 
ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:34.959209 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqvd7\" (UniqueName: \"kubernetes.io/projected/a31a70b8-65d8-418e-aa56-02717c6e7e73-kube-api-access-wqvd7\") pod \"stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8\" (UID: \"a31a70b8-65d8-418e-aa56-02717c6e7e73\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:35.013338 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:35.013302 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:35.143601 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:35.143427 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8"] Apr 24 17:13:35.146357 ip-10-0-137-69 kubenswrapper[2563]: W0424 17:13:35.146329 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda31a70b8_65d8_418e_aa56_02717c6e7e73.slice/crio-29613425f96d8288df28a587ec4509fbc120040d8ab9f99bdf1f1c46e47b8a77 WatchSource:0}: Error finding container 29613425f96d8288df28a587ec4509fbc120040d8ab9f99bdf1f1c46e47b8a77: Status 404 returned error can't find the container with id 29613425f96d8288df28a587ec4509fbc120040d8ab9f99bdf1f1c46e47b8a77 Apr 24 17:13:35.274065 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:35.274026 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" event={"ID":"a31a70b8-65d8-418e-aa56-02717c6e7e73","Type":"ContainerStarted","Data":"92af2fde0f62e107fc682a12f873b950bfee813bb27dfbc16a6a4df4022e9ab4"} Apr 24 17:13:35.274065 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:35.274065 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" event={"ID":"a31a70b8-65d8-418e-aa56-02717c6e7e73","Type":"ContainerStarted","Data":"29613425f96d8288df28a587ec4509fbc120040d8ab9f99bdf1f1c46e47b8a77"} Apr 24 17:13:36.278304 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:36.278264 2563 generic.go:358] "Generic (PLEG): container finished" podID="a31a70b8-65d8-418e-aa56-02717c6e7e73" containerID="92af2fde0f62e107fc682a12f873b950bfee813bb27dfbc16a6a4df4022e9ab4" exitCode=0 Apr 24 17:13:36.278700 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:36.278314 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" event={"ID":"a31a70b8-65d8-418e-aa56-02717c6e7e73","Type":"ContainerDied","Data":"92af2fde0f62e107fc682a12f873b950bfee813bb27dfbc16a6a4df4022e9ab4"} Apr 24 17:13:37.283875 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:37.283840 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" event={"ID":"a31a70b8-65d8-418e-aa56-02717c6e7e73","Type":"ContainerStarted","Data":"376d9725f52d70ae6d122c0e798091b6bb876f1d9da95fa7e22c8c912417b6f5"} Apr 24 17:13:37.283875 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:37.283878 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" event={"ID":"a31a70b8-65d8-418e-aa56-02717c6e7e73","Type":"ContainerStarted","Data":"78788a144f0e419cf41c1b614f9323139e0b498578f82daed7c94cfa64e3dd0d"} Apr 24 17:13:37.284318 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:37.284007 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:37.308248 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:37.308197 2563 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" podStartSLOduration=3.308178433 podStartE2EDuration="3.308178433s" podCreationTimestamp="2026-04-24 17:13:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:13:37.306177566 +0000 UTC m=+2075.976808467" watchObservedRunningTime="2026-04-24 17:13:37.308178433 +0000 UTC m=+2075.978809337" Apr 24 17:13:45.014406 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:45.014365 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:45.014406 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:45.014411 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:45.016843 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:45.016818 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:13:45.311436 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:13:45.311360 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:14:06.314527 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:14:06.314495 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8" Apr 24 17:28:48.276356 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:28:48.276278 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth"] Apr 24 17:28:48.278810 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:28:48.276551 2563 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth" podUID="a8fffed1-71d1-4b6e-be08-88db6fedf6a7" containerName="manager" containerID="cri-o://ef2d8c21159a6e62a23aa5c45f5442088e7fee55746ba8985e4caea54959a9f2" gracePeriod=30 Apr 24 17:28:48.520097 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:28:48.520073 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth" Apr 24 17:28:48.616836 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:28:48.616763 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtv2k\" (UniqueName: \"kubernetes.io/projected/a8fffed1-71d1-4b6e-be08-88db6fedf6a7-kube-api-access-vtv2k\") pod \"a8fffed1-71d1-4b6e-be08-88db6fedf6a7\" (UID: \"a8fffed1-71d1-4b6e-be08-88db6fedf6a7\") " Apr 24 17:28:48.616836 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:28:48.616800 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8fffed1-71d1-4b6e-be08-88db6fedf6a7-cert\") pod \"a8fffed1-71d1-4b6e-be08-88db6fedf6a7\" (UID: \"a8fffed1-71d1-4b6e-be08-88db6fedf6a7\") " Apr 24 17:28:48.618736 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:28:48.618712 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8fffed1-71d1-4b6e-be08-88db6fedf6a7-cert" (OuterVolumeSpecName: "cert") pod "a8fffed1-71d1-4b6e-be08-88db6fedf6a7" (UID: "a8fffed1-71d1-4b6e-be08-88db6fedf6a7"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:28:48.618840 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:28:48.618757 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8fffed1-71d1-4b6e-be08-88db6fedf6a7-kube-api-access-vtv2k" (OuterVolumeSpecName: "kube-api-access-vtv2k") pod "a8fffed1-71d1-4b6e-be08-88db6fedf6a7" (UID: "a8fffed1-71d1-4b6e-be08-88db6fedf6a7"). InnerVolumeSpecName "kube-api-access-vtv2k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:28:48.717309 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:28:48.717281 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vtv2k\" (UniqueName: \"kubernetes.io/projected/a8fffed1-71d1-4b6e-be08-88db6fedf6a7-kube-api-access-vtv2k\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 17:28:48.717309 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:28:48.717306 2563 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8fffed1-71d1-4b6e-be08-88db6fedf6a7-cert\") on node \"ip-10-0-137-69.ec2.internal\" DevicePath \"\"" Apr 24 17:28:49.113768 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:28:49.113736 2563 generic.go:358] "Generic (PLEG): container finished" podID="a8fffed1-71d1-4b6e-be08-88db6fedf6a7" containerID="ef2d8c21159a6e62a23aa5c45f5442088e7fee55746ba8985e4caea54959a9f2" exitCode=0 Apr 24 17:28:49.113918 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:28:49.113782 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth" event={"ID":"a8fffed1-71d1-4b6e-be08-88db6fedf6a7","Type":"ContainerDied","Data":"ef2d8c21159a6e62a23aa5c45f5442088e7fee55746ba8985e4caea54959a9f2"} Apr 24 17:28:49.113918 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:28:49.113806 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth" 
event={"ID":"a8fffed1-71d1-4b6e-be08-88db6fedf6a7","Type":"ContainerDied","Data":"769ca6e88bd01aef828472e419c34b36b4bac6c161c3a603a48ee54ee56cedf0"} Apr 24 17:28:49.113918 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:28:49.113810 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth" Apr 24 17:28:49.113918 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:28:49.113882 2563 scope.go:117] "RemoveContainer" containerID="ef2d8c21159a6e62a23aa5c45f5442088e7fee55746ba8985e4caea54959a9f2" Apr 24 17:28:49.122375 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:28:49.122358 2563 scope.go:117] "RemoveContainer" containerID="ef2d8c21159a6e62a23aa5c45f5442088e7fee55746ba8985e4caea54959a9f2" Apr 24 17:28:49.122641 ip-10-0-137-69 kubenswrapper[2563]: E0424 17:28:49.122622 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef2d8c21159a6e62a23aa5c45f5442088e7fee55746ba8985e4caea54959a9f2\": container with ID starting with ef2d8c21159a6e62a23aa5c45f5442088e7fee55746ba8985e4caea54959a9f2 not found: ID does not exist" containerID="ef2d8c21159a6e62a23aa5c45f5442088e7fee55746ba8985e4caea54959a9f2" Apr 24 17:28:49.122693 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:28:49.122648 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef2d8c21159a6e62a23aa5c45f5442088e7fee55746ba8985e4caea54959a9f2"} err="failed to get container status \"ef2d8c21159a6e62a23aa5c45f5442088e7fee55746ba8985e4caea54959a9f2\": rpc error: code = NotFound desc = could not find container \"ef2d8c21159a6e62a23aa5c45f5442088e7fee55746ba8985e4caea54959a9f2\": container with ID starting with ef2d8c21159a6e62a23aa5c45f5442088e7fee55746ba8985e4caea54959a9f2 not found: ID does not exist" Apr 24 17:28:49.133743 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:28:49.133721 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth"] Apr 24 17:28:49.136452 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:28:49.136429 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-5dcd86f4cc-8gjth"] Apr 24 17:28:49.859263 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:28:49.859210 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8fffed1-71d1-4b6e-be08-88db6fedf6a7" path="/var/lib/kubelet/pods/a8fffed1-71d1-4b6e-be08-88db6fedf6a7/volumes" Apr 24 17:29:34.305174 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:34.305127 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6bxxc_84e41cce-eb45-47d6-8491-6ec2143a84ed/istio-proxy/0.log" Apr 24 17:29:34.359090 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:34.359038 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/main/0.log" Apr 24 17:29:34.370416 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:34.370394 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/tokenizer/0.log" Apr 24 17:29:34.379798 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:34.379781 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/storage-initializer/0.log" Apr 24 17:29:35.349393 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:35.349366 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6bxxc_84e41cce-eb45-47d6-8491-6ec2143a84ed/istio-proxy/0.log" Apr 24 17:29:35.393868 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:35.393844 2563 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/main/0.log" Apr 24 17:29:35.405573 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:35.405548 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/tokenizer/0.log" Apr 24 17:29:35.415113 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:35.415095 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/storage-initializer/0.log" Apr 24 17:29:36.378020 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:36.377961 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6bxxc_84e41cce-eb45-47d6-8491-6ec2143a84ed/istio-proxy/0.log" Apr 24 17:29:36.421809 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:36.421784 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/main/0.log" Apr 24 17:29:36.433444 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:36.433423 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/tokenizer/0.log" Apr 24 17:29:36.440909 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:36.440895 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/storage-initializer/0.log" Apr 24 17:29:37.376844 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:37.376816 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6bxxc_84e41cce-eb45-47d6-8491-6ec2143a84ed/istio-proxy/0.log" Apr 24 17:29:37.417091 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:37.417070 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/main/0.log" Apr 24 17:29:37.429388 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:37.429363 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/tokenizer/0.log" Apr 24 17:29:37.437993 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:37.437960 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/storage-initializer/0.log" Apr 24 17:29:38.387055 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:38.387008 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6bxxc_84e41cce-eb45-47d6-8491-6ec2143a84ed/istio-proxy/0.log" Apr 24 17:29:38.425984 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:38.425955 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/main/0.log" Apr 24 17:29:38.436524 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:38.436504 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/tokenizer/0.log" Apr 24 17:29:38.445164 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:38.445128 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/storage-initializer/0.log" Apr 24 17:29:39.382127 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:39.382099 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6bxxc_84e41cce-eb45-47d6-8491-6ec2143a84ed/istio-proxy/0.log" Apr 24 17:29:39.424345 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:39.424322 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/main/0.log" Apr 24 17:29:39.434949 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:39.434927 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/tokenizer/0.log" Apr 24 17:29:39.443345 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:39.443329 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/storage-initializer/0.log" Apr 24 17:29:40.376679 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:40.376651 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6bxxc_84e41cce-eb45-47d6-8491-6ec2143a84ed/istio-proxy/0.log" Apr 24 17:29:40.420481 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:40.420456 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/main/0.log" Apr 24 17:29:40.433119 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:40.433095 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/tokenizer/0.log" Apr 24 17:29:40.441540 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:40.441520 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/storage-initializer/0.log" Apr 24 17:29:41.380539 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:41.380510 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6bxxc_84e41cce-eb45-47d6-8491-6ec2143a84ed/istio-proxy/0.log" Apr 24 17:29:41.426681 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:41.426650 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/main/0.log" Apr 24 17:29:41.438976 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:41.438949 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/tokenizer/0.log" Apr 24 17:29:41.448218 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:41.448195 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/storage-initializer/0.log" Apr 24 17:29:42.384710 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:42.384671 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6bxxc_84e41cce-eb45-47d6-8491-6ec2143a84ed/istio-proxy/0.log" Apr 24 17:29:42.424544 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:42.424514 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/main/0.log" Apr 24 17:29:42.436051 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:42.436024 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/tokenizer/0.log" Apr 24 17:29:42.443866 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:42.443846 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/storage-initializer/0.log" Apr 24 17:29:43.394048 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:43.393992 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6bxxc_84e41cce-eb45-47d6-8491-6ec2143a84ed/istio-proxy/0.log" Apr 24 17:29:43.436405 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:43.436381 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/main/0.log" Apr 24 17:29:43.447515 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:43.447490 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/tokenizer/0.log" Apr 24 17:29:43.455733 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:43.455710 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/storage-initializer/0.log" Apr 24 17:29:44.414765 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:44.414737 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6bxxc_84e41cce-eb45-47d6-8491-6ec2143a84ed/istio-proxy/0.log" Apr 24 17:29:44.456213 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:44.456191 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/main/0.log" Apr 24 17:29:44.466744 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:44.466721 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/tokenizer/0.log" Apr 24 17:29:44.474626 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:44.474610 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/storage-initializer/0.log" Apr 24 17:29:45.438768 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:45.438739 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6bxxc_84e41cce-eb45-47d6-8491-6ec2143a84ed/istio-proxy/0.log" Apr 24 17:29:45.479676 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:45.479650 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/main/0.log" Apr 24 17:29:45.490778 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:45.490758 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/tokenizer/0.log" Apr 24 17:29:45.499010 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:45.498990 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/storage-initializer/0.log" Apr 24 17:29:46.465511 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:46.465486 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6bxxc_84e41cce-eb45-47d6-8491-6ec2143a84ed/istio-proxy/0.log" Apr 24 17:29:46.510328 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:46.510305 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/main/0.log" Apr 24 17:29:46.521359 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:46.521338 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/tokenizer/0.log" Apr 24 17:29:46.530699 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:46.530658 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/storage-initializer/0.log" Apr 24 17:29:47.518818 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:47.518790 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6bxxc_84e41cce-eb45-47d6-8491-6ec2143a84ed/istio-proxy/0.log" Apr 24 17:29:47.562983 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:47.562959 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/main/0.log" Apr 24 17:29:47.574591 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:47.574564 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/tokenizer/0.log" Apr 24 17:29:47.582404 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:47.582380 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-router-scheduler-6c85574b4f-hkzx8_a31a70b8-65d8-418e-aa56-02717c6e7e73/storage-initializer/0.log" Apr 24 17:29:48.579492 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:48.579459 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-8pfrd_96a57a22-0f7f-4408-bbaf-0671067e8c5d/discovery/0.log" Apr 24 17:29:49.368834 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:49.368803 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-8pfrd_96a57a22-0f7f-4408-bbaf-0671067e8c5d/discovery/0.log" Apr 24 17:29:50.194899 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:50.194862 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-8pfrd_96a57a22-0f7f-4408-bbaf-0671067e8c5d/discovery/0.log" Apr 24 17:29:50.979435 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:50.979404 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-x9t7n_241305b7-c446-4de1-b201-1446823b4e25/manager/0.log" Apr 24 17:29:51.057422 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:51.057395 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-bjv8q_be1d3f54-1ae3-44d1-ae31-a9f37c8a1ea3/manager/0.log" Apr 24 17:29:51.889415 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:51.889388 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-x9t7n_241305b7-c446-4de1-b201-1446823b4e25/manager/0.log" Apr 24 
17:29:51.948930 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:51.948903 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-bjv8q_be1d3f54-1ae3-44d1-ae31-a9f37c8a1ea3/manager/0.log" Apr 24 17:29:52.775570 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:52.775540 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-x9t7n_241305b7-c446-4de1-b201-1446823b4e25/manager/0.log" Apr 24 17:29:52.836348 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:52.836305 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-bjv8q_be1d3f54-1ae3-44d1-ae31-a9f37c8a1ea3/manager/0.log" Apr 24 17:29:53.654281 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:53.654249 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-x9t7n_241305b7-c446-4de1-b201-1446823b4e25/manager/0.log" Apr 24 17:29:53.712928 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:53.712898 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-bjv8q_be1d3f54-1ae3-44d1-ae31-a9f37c8a1ea3/manager/0.log" Apr 24 17:29:54.527702 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:54.527672 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-x9t7n_241305b7-c446-4de1-b201-1446823b4e25/manager/0.log" Apr 24 17:29:54.588004 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:29:54.587970 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-bjv8q_be1d3f54-1ae3-44d1-ae31-a9f37c8a1ea3/manager/0.log" Apr 24 17:30:00.079810 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:00.079783 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-6swzl_c883a6e5-b74b-4d33-9372-a3da5fd267f6/global-pull-secret-syncer/0.log" Apr 24 17:30:00.235708 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:00.235682 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xxmc2_e21e6557-bb10-4c31-b3ef-7bcfc18c9d27/konnectivity-agent/0.log" Apr 24 17:30:00.280094 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:00.280069 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-69.ec2.internal_f61c386af5d7ea584ac3696d9ab65309/haproxy/0.log" Apr 24 17:30:04.672768 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:04.672738 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-x9t7n_241305b7-c446-4de1-b201-1446823b4e25/manager/0.log" Apr 24 17:30:04.788253 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:04.788224 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-bjv8q_be1d3f54-1ae3-44d1-ae31-a9f37c8a1ea3/manager/0.log" Apr 24 17:30:06.371364 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:06.371335 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7v47c_7c06394a-d2a9-474f-8448-b491ea74f9df/node-exporter/0.log" Apr 24 17:30:06.392976 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:06.392952 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7v47c_7c06394a-d2a9-474f-8448-b491ea74f9df/kube-rbac-proxy/0.log" Apr 24 17:30:06.412630 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:06.412613 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7v47c_7c06394a-d2a9-474f-8448-b491ea74f9df/init-textfile/0.log" Apr 24 17:30:09.065663 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.065624 2563 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-console_console-67dc464fdd-fvrg2_5e07f75a-06de-48fb-a96f-5df55fd55f5d/console/0.log" Apr 24 17:30:09.375200 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.375113 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2"] Apr 24 17:30:09.375447 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.375434 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8fffed1-71d1-4b6e-be08-88db6fedf6a7" containerName="manager" Apr 24 17:30:09.375493 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.375449 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8fffed1-71d1-4b6e-be08-88db6fedf6a7" containerName="manager" Apr 24 17:30:09.375530 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.375503 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8fffed1-71d1-4b6e-be08-88db6fedf6a7" containerName="manager" Apr 24 17:30:09.378372 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.378356 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2" Apr 24 17:30:09.381451 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.381422 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-47mg2\"/\"kube-root-ca.crt\"" Apr 24 17:30:09.382255 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.382235 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-47mg2\"/\"default-dockercfg-bn9xv\"" Apr 24 17:30:09.382369 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.382261 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-47mg2\"/\"openshift-service-ca.crt\"" Apr 24 17:30:09.386091 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.386068 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2"] Apr 24 17:30:09.422721 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.422695 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2441056a-c208-43ec-a9e6-016a97507516-podres\") pod \"perf-node-gather-daemonset-7jwh2\" (UID: \"2441056a-c208-43ec-a9e6-016a97507516\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2" Apr 24 17:30:09.422845 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.422753 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2441056a-c208-43ec-a9e6-016a97507516-proc\") pod \"perf-node-gather-daemonset-7jwh2\" (UID: \"2441056a-c208-43ec-a9e6-016a97507516\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2" Apr 24 17:30:09.422845 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.422821 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2441056a-c208-43ec-a9e6-016a97507516-lib-modules\") pod \"perf-node-gather-daemonset-7jwh2\" (UID: \"2441056a-c208-43ec-a9e6-016a97507516\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2" Apr 24 17:30:09.422964 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.422860 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5p5s\" (UniqueName: \"kubernetes.io/projected/2441056a-c208-43ec-a9e6-016a97507516-kube-api-access-d5p5s\") pod \"perf-node-gather-daemonset-7jwh2\" (UID: \"2441056a-c208-43ec-a9e6-016a97507516\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2" Apr 24 17:30:09.422964 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.422899 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2441056a-c208-43ec-a9e6-016a97507516-sys\") pod \"perf-node-gather-daemonset-7jwh2\" (UID: \"2441056a-c208-43ec-a9e6-016a97507516\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2" Apr 24 17:30:09.524171 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.524146 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2441056a-c208-43ec-a9e6-016a97507516-podres\") pod \"perf-node-gather-daemonset-7jwh2\" (UID: \"2441056a-c208-43ec-a9e6-016a97507516\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2" Apr 24 17:30:09.524295 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.524196 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2441056a-c208-43ec-a9e6-016a97507516-proc\") pod \"perf-node-gather-daemonset-7jwh2\" (UID: \"2441056a-c208-43ec-a9e6-016a97507516\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2" 
Apr 24 17:30:09.524295 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.524219 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2441056a-c208-43ec-a9e6-016a97507516-lib-modules\") pod \"perf-node-gather-daemonset-7jwh2\" (UID: \"2441056a-c208-43ec-a9e6-016a97507516\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2" Apr 24 17:30:09.524295 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.524243 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5p5s\" (UniqueName: \"kubernetes.io/projected/2441056a-c208-43ec-a9e6-016a97507516-kube-api-access-d5p5s\") pod \"perf-node-gather-daemonset-7jwh2\" (UID: \"2441056a-c208-43ec-a9e6-016a97507516\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2" Apr 24 17:30:09.524295 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.524265 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2441056a-c208-43ec-a9e6-016a97507516-sys\") pod \"perf-node-gather-daemonset-7jwh2\" (UID: \"2441056a-c208-43ec-a9e6-016a97507516\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2" Apr 24 17:30:09.524295 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.524265 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2441056a-c208-43ec-a9e6-016a97507516-proc\") pod \"perf-node-gather-daemonset-7jwh2\" (UID: \"2441056a-c208-43ec-a9e6-016a97507516\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2" Apr 24 17:30:09.524295 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.524278 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2441056a-c208-43ec-a9e6-016a97507516-podres\") pod \"perf-node-gather-daemonset-7jwh2\" (UID: 
\"2441056a-c208-43ec-a9e6-016a97507516\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2" Apr 24 17:30:09.524534 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.524388 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2441056a-c208-43ec-a9e6-016a97507516-lib-modules\") pod \"perf-node-gather-daemonset-7jwh2\" (UID: \"2441056a-c208-43ec-a9e6-016a97507516\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2" Apr 24 17:30:09.524534 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.524396 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2441056a-c208-43ec-a9e6-016a97507516-sys\") pod \"perf-node-gather-daemonset-7jwh2\" (UID: \"2441056a-c208-43ec-a9e6-016a97507516\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2" Apr 24 17:30:09.532537 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.532511 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5p5s\" (UniqueName: \"kubernetes.io/projected/2441056a-c208-43ec-a9e6-016a97507516-kube-api-access-d5p5s\") pod \"perf-node-gather-daemonset-7jwh2\" (UID: \"2441056a-c208-43ec-a9e6-016a97507516\") " pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2" Apr 24 17:30:09.688421 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.688401 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2"
Apr 24 17:30:09.803735 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.803645 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2"]
Apr 24 17:30:09.806352 ip-10-0-137-69 kubenswrapper[2563]: W0424 17:30:09.806321 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2441056a_c208_43ec_a9e6_016a97507516.slice/crio-a5f4ef81de294204e6eadb10ea41d9c47696a1cb7fbb4ea077a4d0cdbc8e50eb WatchSource:0}: Error finding container a5f4ef81de294204e6eadb10ea41d9c47696a1cb7fbb4ea077a4d0cdbc8e50eb: Status 404 returned error can't find the container with id a5f4ef81de294204e6eadb10ea41d9c47696a1cb7fbb4ea077a4d0cdbc8e50eb
Apr 24 17:30:09.807816 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:09.807798 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 17:30:10.366571 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:10.366534 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2" event={"ID":"2441056a-c208-43ec-a9e6-016a97507516","Type":"ContainerStarted","Data":"37f03efa089f2abdc75a6ec7a8cf1f0480b90e8a4b4dcf2edad3eded7a4d8f38"}
Apr 24 17:30:10.366571 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:10.366572 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2" event={"ID":"2441056a-c208-43ec-a9e6-016a97507516","Type":"ContainerStarted","Data":"a5f4ef81de294204e6eadb10ea41d9c47696a1cb7fbb4ea077a4d0cdbc8e50eb"}
Apr 24 17:30:10.367061 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:10.366624 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2"
Apr 24 17:30:10.382110 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:10.382064 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2" podStartSLOduration=1.3820487049999999 podStartE2EDuration="1.382048705s" podCreationTimestamp="2026-04-24 17:30:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:30:10.38077235 +0000 UTC m=+3069.051403275" watchObservedRunningTime="2026-04-24 17:30:10.382048705 +0000 UTC m=+3069.052679608"
Apr 24 17:30:10.408406 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:10.408380 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vwpz2_086b1d61-526e-4540-a577-70f6d4cc1109/dns/0.log"
Apr 24 17:30:10.428697 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:10.428678 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vwpz2_086b1d61-526e-4540-a577-70f6d4cc1109/kube-rbac-proxy/0.log"
Apr 24 17:30:10.450448 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:10.450426 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5dlzs_b514d98b-1969-443b-b7cf-8c931162148a/dns-node-resolver/0.log"
Apr 24 17:30:11.019558 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:11.019529 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j59hh_c532fe1e-4595-4088-b40a-8ee0058e4ccd/node-ca/0.log"
Apr 24 17:30:11.846855 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:11.846825 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-8pfrd_96a57a22-0f7f-4408-bbaf-0671067e8c5d/discovery/0.log"
Apr 24 17:30:12.351420 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:12.351393 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rcmmv_5f5bde11-c32a-402f-994e-143e03a8dd70/serve-healthcheck-canary/0.log"
Apr 24 17:30:12.881586 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:12.881555 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fkrh6_f99073ce-374c-4453-a1aa-6ee180e85c92/kube-rbac-proxy/0.log"
Apr 24 17:30:12.902498 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:12.902471 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fkrh6_f99073ce-374c-4453-a1aa-6ee180e85c92/exporter/0.log"
Apr 24 17:30:12.925115 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:12.925090 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fkrh6_f99073ce-374c-4453-a1aa-6ee180e85c92/extractor/0.log"
Apr 24 17:30:16.114729 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:16.114698 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-7f7fb4c66f-9zdjk_e244dc29-8bf9-41c2-8af1-fd6e2a17fd35/manager/0.log"
Apr 24 17:30:16.158022 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:16.157978 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-g9g52_c25daac5-251d-47fa-a76a-9a28fd34cfa9/server/0.log"
Apr 24 17:30:16.378958 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:16.378890 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-47mg2/perf-node-gather-daemonset-7jwh2"
Apr 24 17:30:16.390201 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:16.390179 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-4267n_b407162a-89ff-4f61-8939-6b050381662f/seaweedfs/0.log"
Apr 24 17:30:22.722417 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:22.722384 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hhp56_36eba3f6-5e75-4e04-8052-6248d70f2dd3/kube-multus-additional-cni-plugins/0.log"
Apr 24 17:30:22.742268 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:22.742247 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hhp56_36eba3f6-5e75-4e04-8052-6248d70f2dd3/egress-router-binary-copy/0.log"
Apr 24 17:30:22.763579 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:22.763555 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hhp56_36eba3f6-5e75-4e04-8052-6248d70f2dd3/cni-plugins/0.log"
Apr 24 17:30:22.784394 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:22.784371 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hhp56_36eba3f6-5e75-4e04-8052-6248d70f2dd3/bond-cni-plugin/0.log"
Apr 24 17:30:22.805458 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:22.805435 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hhp56_36eba3f6-5e75-4e04-8052-6248d70f2dd3/routeoverride-cni/0.log"
Apr 24 17:30:22.825086 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:22.825066 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hhp56_36eba3f6-5e75-4e04-8052-6248d70f2dd3/whereabouts-cni-bincopy/0.log"
Apr 24 17:30:22.845256 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:22.845235 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hhp56_36eba3f6-5e75-4e04-8052-6248d70f2dd3/whereabouts-cni/0.log"
Apr 24 17:30:23.098836 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:23.098767 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vzbf9_71bf5297-d6b6-4b3b-a109-b99777f79b22/kube-multus/0.log"
Apr 24 17:30:23.118485 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:23.118461 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bzld5_b98926f9-5237-4269-877b-422b4e1c6edf/network-metrics-daemon/0.log"
Apr 24 17:30:23.138503 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:23.138484 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bzld5_b98926f9-5237-4269-877b-422b4e1c6edf/kube-rbac-proxy/0.log"
Apr 24 17:30:24.326910 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:24.326880 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dmn7d_55acf3aa-ca3c-4efa-893f-473501b43621/ovn-controller/0.log"
Apr 24 17:30:24.357583 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:24.357549 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dmn7d_55acf3aa-ca3c-4efa-893f-473501b43621/ovn-acl-logging/0.log"
Apr 24 17:30:24.379195 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:24.379175 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dmn7d_55acf3aa-ca3c-4efa-893f-473501b43621/kube-rbac-proxy-node/0.log"
Apr 24 17:30:24.400789 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:24.400771 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dmn7d_55acf3aa-ca3c-4efa-893f-473501b43621/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 17:30:24.418263 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:24.418247 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dmn7d_55acf3aa-ca3c-4efa-893f-473501b43621/northd/0.log"
Apr 24 17:30:24.456576 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:24.456543 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dmn7d_55acf3aa-ca3c-4efa-893f-473501b43621/nbdb/0.log"
Apr 24 17:30:24.526122 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:24.526103 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dmn7d_55acf3aa-ca3c-4efa-893f-473501b43621/sbdb/0.log"
Apr 24 17:30:24.640886 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:24.640809 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dmn7d_55acf3aa-ca3c-4efa-893f-473501b43621/ovnkube-controller/0.log"
Apr 24 17:30:26.031910 ip-10-0-137-69 kubenswrapper[2563]: I0424 17:30:26.031880 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-spk8g_1fb076e8-3881-4444-98b6-2d67d3820579/network-check-target-container/0.log"