Apr 23 16:35:05.329199 ip-10-0-141-189 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 16:35:05.778377 ip-10-0-141-189 kubenswrapper[2561]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:05.778377 ip-10-0-141-189 kubenswrapper[2561]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 16:35:05.779378 ip-10-0-141-189 kubenswrapper[2561]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:05.779378 ip-10-0-141-189 kubenswrapper[2561]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 16:35:05.779378 ip-10-0-141-189 kubenswrapper[2561]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:05.781029 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.780929 2561 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
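The deprecation warnings above all point at the same remedy: move the flags into the KubeletConfiguration file named by --config (on this node /etc/kubernetes/kubelet.conf, per the FLAG dump further down). A minimal sketch of the equivalent stanza, using upstream kubelet.config.k8s.io/v1beta1 field names and the values visible in the FLAG dump below — the actual layout of this node's kubelet.conf is rendered by OpenShift and may differ:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    # replaces --container-runtime-endpoint="/var/run/crio/crio.sock"
    # (the config field takes a URL; the unix:// scheme is assumed here)
    containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
    # replaces --volume-plugin-dir
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
    # replaces --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
    systemReserved:
      cpu: 500m
      memory: 1Gi
      ephemeral-storage: 1Gi

The other two warnings have different remedies: --pod-infra-container-image moves to the CRI runtime's own configuration, as the server.go:212 line notes, and --minimum-container-ttl-duration maps to the eviction settings sketched after the FLAG dump below.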
Apr 23 16:35:05.786579 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786564 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:05.786579 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786580 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:05.786640 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786584 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:05.786640 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786587 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:05.786640 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786597 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:05.786640 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786600 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:05.786640 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786603 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:05.786640 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786606 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:05.786640 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786608 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:05.786640 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786611 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:05.786640 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786616 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:05.786640 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786620 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:05.786640 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786623 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:05.786640 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786626 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:05.786640 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786628 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:05.786640 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786631 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:05.786640 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786634 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:05.786640 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786637 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:05.786640 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786639 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:05.786640 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786642 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:05.786640 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786645 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:05.786640 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786648 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:05.787136 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786651 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:05.787136 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786653 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:05.787136 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786656 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:05.787136 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786659 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:05.787136 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786661 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:05.787136 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786664 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:05.787136 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786667 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:05.787136 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786669 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:05.787136 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786672 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:05.787136 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786674 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:05.787136 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786676 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:05.787136 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786679 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:05.787136 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786682 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:05.787136 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786685 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:05.787136 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786688 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:05.787136 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786690 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:05.787136 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786693 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:05.787136 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786695 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:05.787136 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786697 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:05.787136 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786700 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:05.787718 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786702 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:05.787718 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786705 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:05.787718 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786708 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:05.787718 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786711 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:05.787718 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786713 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:05.787718 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786716 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:05.787718 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786718 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:05.787718 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786721 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:05.787718 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786723 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:05.787718 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786726 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:05.787718 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786728 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:05.787718 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786731 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:05.787718 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786733 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:05.787718 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786736 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:05.787718 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786739 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:05.787718 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786742 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:05.787718 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786744 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:05.787718 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786747 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:05.787718 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786749 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:05.788209 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786752 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:05.788209 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786754 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:05.788209 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786757 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:05.788209 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786759 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:05.788209 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786762 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:05.788209 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786764 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:05.788209 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786767 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:05.788209 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786769 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:05.788209 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786772 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:05.788209 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786774 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:05.788209 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786777 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:05.788209 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786779 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:05.788209 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786782 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:05.788209 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786784 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:05.788209 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786787 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:05.788209 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786789 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:05.788209 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786792 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:05.788209 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786794 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:05.788209 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786798 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:05.788209 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786803 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:05.788683 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786806 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:05.788683 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786808 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:05.788683 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786811 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:05.788683 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786813 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:05.788683 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.786816 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:05.788853 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788840 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:05.788853 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788851 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:05.788931 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788855 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:05.788931 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788858 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:05.788931 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788861 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:05.788931 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788863 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:05.788931 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788866 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:05.788931 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788868 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:05.788931 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788871 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:05.788931 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788889 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:05.788931 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788892 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:05.788931 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788894 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:05.788931 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788897 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:05.788931 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788900 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:05.788931 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788903 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:05.788931 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788913 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:05.788931 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788916 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:05.788931 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788918 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:05.788931 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788921 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:05.788931 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788923 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:05.788931 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788925 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:05.788931 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788928 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:05.789428 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788930 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:05.789428 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788933 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:05.789428 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788936 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:05.789428 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788939 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:05.789428 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788941 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:05.789428 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788944 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:05.789428 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788947 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:05.789428 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788949 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:05.789428 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788952 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:05.789428 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788954 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:05.789428 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788957 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:05.789428 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788959 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:05.789428 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788962 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:05.789428 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788964 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:05.789428 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788967 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:05.789428 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788969 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:05.789428 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788972 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:05.789428 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788974 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:05.789428 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788976 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:05.789428 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788979 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:05.789952 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788982 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:05.789952 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788984 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:05.789952 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788986 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:05.789952 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788989 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:05.789952 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788992 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:05.789952 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788994 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:05.789952 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788997 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:05.789952 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.788999 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:05.789952 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789002 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:05.789952 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789004 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:05.789952 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789007 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:05.789952 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789009 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:05.789952 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789011 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:05.789952 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789014 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:05.789952 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789017 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:05.789952 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789019 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:05.789952 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789022 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:05.789952 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789024 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:05.789952 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789027 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:05.789952 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789029 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:05.790438 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789033 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:05.790438 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789035 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:05.790438 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789038 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:05.790438 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789041 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:05.790438 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789045 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:05.790438 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789048 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:05.790438 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789051 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:05.790438 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789054 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:05.790438 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789057 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:05.790438 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789060 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:05.790438 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789062 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:05.790438 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789065 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:05.790438 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789068 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:05.790438 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789072 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:05.790438 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789076 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:05.790438 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789078 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:05.790438 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789081 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:05.790438 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789084 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:05.790438 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789086 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789089 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789091 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789094 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789096 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789099 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
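Every "unrecognized feature gate" name above (AWSClusterHostedDNS, GatewayAPI, NewOLM, and the rest) is an OpenShift cluster-level gate rather than an upstream Kubernetes one; the kubelet's feature_gate parser only knows upstream gates, so it warns on each foreign name and, as the feature_gate.go:384 summary near the end of this log shows, carries on with only the gates it recognizes. The full list is re-logged each time the gate map is re-parsed during startup, which is why it repeats below. Since --feature-gates="" in the FLAG dump that follows, the gates presumably arrive through the featureGates stanza of /etc/kubernetes/kubelet.conf. A sketch of such a stanza, under that assumption; the two upstream values are the ones this log reports as applied, and the OpenShift-only key is illustrative:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    featureGates:
      KMSv1: true                           # warned above as deprecated, then applied
      ServiceAccountTokenNodeBinding: true  # warned above as already GA, then applied
      GatewayAPI: true                      # OpenShift-only key (illustrative): logged as "unrecognized" and skipped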
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789162 2561 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789172 2561 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789179 2561 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789183 2561 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789188 2561 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789191 2561 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789196 2561 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789200 2561 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789203 2561 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789207 2561 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789210 2561 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789214 2561 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789217 2561 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789220 2561 flags.go:64] FLAG: --cgroup-root=""
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789223 2561 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789226 2561 flags.go:64] FLAG: --client-ca-file=""
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789229 2561 flags.go:64] FLAG: --cloud-config=""
Apr 23 16:35:05.790913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789231 2561 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789235 2561 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789238 2561 flags.go:64] FLAG: --cluster-domain=""
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789241 2561 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789244 2561 flags.go:64] FLAG: --config-dir=""
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789247 2561 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789250 2561 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789254 2561 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789257 2561 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789261 2561 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789264 2561 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789267 2561 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789270 2561 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789273 2561 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789276 2561 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789279 2561 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789283 2561 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789286 2561 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789289 2561 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789292 2561 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789295 2561 flags.go:64] FLAG: --enable-server="true"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789298 2561 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789302 2561 flags.go:64] FLAG: --event-burst="100"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789305 2561 flags.go:64] FLAG: --event-qps="50"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789308 2561 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 16:35:05.791488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789311 2561 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789314 2561 flags.go:64] FLAG: --eviction-hard=""
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789318 2561 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789321 2561 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789324 2561 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789327 2561 flags.go:64] FLAG: --eviction-soft=""
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789330 2561 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789333 2561 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789336 2561 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789339 2561 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789342 2561 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789344 2561 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789347 2561 flags.go:64] FLAG: --feature-gates=""
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789351 2561 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789354 2561 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789357 2561 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789360 2561 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789364 2561 flags.go:64] FLAG: --healthz-port="10248"
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789367 2561 flags.go:64] FLAG: --help="false"
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789369 2561 flags.go:64] FLAG: --hostname-override="ip-10-0-141-189.ec2.internal"
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789372 2561 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789375 2561 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789379 2561 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789383 2561 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 23 16:35:05.792101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789386 2561 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789389 2561 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789392 2561 flags.go:64] FLAG: --image-service-endpoint=""
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789394 2561 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789397 2561 flags.go:64] FLAG: --kube-api-burst="100"
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789400 2561 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789403 2561 flags.go:64] FLAG: --kube-api-qps="50"
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789406 2561 flags.go:64] FLAG: --kube-reserved=""
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789409 2561 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789412 2561 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789415 2561 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789417 2561 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789420 2561 flags.go:64] FLAG: --lock-file=""
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789423 2561 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789426 2561 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789429 2561 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789434 2561 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789437 2561 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789440 2561 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789443 2561 flags.go:64] FLAG: --logging-format="text"
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789446 2561 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789449 2561 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789452 2561 flags.go:64] FLAG: --manifest-url=""
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789455 2561 flags.go:64] FLAG: --manifest-url-header=""
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789460 2561 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 23 16:35:05.792721 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789475 2561 flags.go:64] FLAG: --max-open-files="1000000"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789480 2561 flags.go:64] FLAG: --max-pods="110"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789484 2561 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789487 2561 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789490 2561 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789493 2561 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789496 2561 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789499 2561 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789502 2561 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789509 2561 flags.go:64] FLAG: --node-status-max-images="50"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789512 2561 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789515 2561 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789518 2561 flags.go:64] FLAG: --pod-cidr=""
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789521 2561 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789527 2561 flags.go:64] FLAG: --pod-manifest-path=""
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789529 2561 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789532 2561 flags.go:64] FLAG: --pods-per-core="0"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789535 2561 flags.go:64] FLAG: --port="10250"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789539 2561 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789542 2561 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00e0b18962d5c6610"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789545 2561 flags.go:64] FLAG: --qos-reserved=""
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789548 2561 flags.go:64] FLAG: --read-only-port="10255"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789551 2561 flags.go:64] FLAG: --register-node="true"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789554 2561 flags.go:64] FLAG: --register-schedulable="true"
Apr 23 16:35:05.793332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789556 2561 flags.go:64] FLAG: --register-with-taints=""
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789560 2561 flags.go:64] FLAG: --registry-burst="10"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789563 2561 flags.go:64] FLAG: --registry-qps="5"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789565 2561 flags.go:64] FLAG: --reserved-cpus=""
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789568 2561 flags.go:64] FLAG: --reserved-memory=""
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789572 2561 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789575 2561 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789578 2561 flags.go:64] FLAG: --rotate-certificates="false"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789581 2561 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789583 2561 flags.go:64] FLAG: --runonce="false"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789587 2561 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789590 2561 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789593 2561 flags.go:64] FLAG: --seccomp-default="false"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789596 2561 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789599 2561 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789601 2561 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789604 2561 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789607 2561 flags.go:64] FLAG: --storage-driver-password="root"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789610 2561 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789613 2561 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789616 2561 flags.go:64] FLAG: --storage-driver-user="root"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789618 2561 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789621 2561 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789624 2561 flags.go:64] FLAG: --system-cgroups=""
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789627 2561 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 23 16:35:05.793954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789632 2561 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 23 16:35:05.794594 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789635 2561 flags.go:64] FLAG: --tls-cert-file=""
Apr 23 16:35:05.794594 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789638 2561 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 23 16:35:05.794594 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789641 2561 flags.go:64] FLAG: --tls-min-version=""
Apr 23 16:35:05.794594 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789644 2561 flags.go:64] FLAG: --tls-private-key-file=""
Apr 23 16:35:05.794594 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789646 2561 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 23 16:35:05.794594 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789649 2561 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 23 16:35:05.794594 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789652 2561 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 23 16:35:05.794594 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789654 2561 flags.go:64] FLAG: --v="2"
Apr 23 16:35:05.794594 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789659 2561 flags.go:64] FLAG: --version="false"
Apr 23 16:35:05.794594 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789662 2561 flags.go:64] FLAG: --vmodule=""
Apr 23 16:35:05.794594 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789667 2561 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 23 16:35:05.794594 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.789670 2561 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
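Note in the dump that --minimum-container-ttl-duration="6m0s" is still set while --eviction-hard="" and --eviction-soft="" are empty, exactly the combination the second deprecation warning at the top flags. The replacement it suggests is the eviction stanza of the config file; a sketch with illustrative thresholds (these values are examples, not taken from this node):

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    evictionHard:                 # replaces --eviction-hard
      memory.available: "100Mi"   # illustrative threshold
      nodefs.available: "10%"
    evictionSoft:                 # replaces --eviction-soft
      memory.available: "200Mi"
    evictionSoftGracePeriod:      # soft evictions need a grace period per signal
      memory.available: "1m30s"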
AutomatedEtcdBackup Apr 23 16:35:05.795152 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789795 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 16:35:05.795152 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789797 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 16:35:05.795152 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789801 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 16:35:05.795152 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789806 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 16:35:05.795152 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789809 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 16:35:05.795152 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789812 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 16:35:05.795152 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789815 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 16:35:05.795152 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789818 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 16:35:05.795152 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789821 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 16:35:05.795152 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789824 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 16:35:05.795152 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789826 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 16:35:05.795152 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789829 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 16:35:05.795152 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789831 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 16:35:05.795152 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789834 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 16:35:05.795152 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789836 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 16:35:05.795152 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789839 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 16:35:05.795152 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789841 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 16:35:05.795657 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789844 2561 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 16:35:05.795657 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789847 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 16:35:05.795657 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789849 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 16:35:05.795657 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789852 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 16:35:05.795657 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789855 2561 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpointsInstall Apr 23 16:35:05.795657 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789857 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 16:35:05.795657 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789860 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 16:35:05.795657 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789862 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 16:35:05.795657 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789865 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 16:35:05.795657 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789868 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 16:35:05.795657 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789871 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 16:35:05.795657 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789885 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 16:35:05.795657 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789888 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 16:35:05.795657 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789891 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 16:35:05.795657 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789893 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 16:35:05.795657 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789896 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 16:35:05.795657 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789899 2561 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 16:35:05.795657 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789901 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 16:35:05.795657 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789904 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 16:35:05.795657 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789906 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 16:35:05.796439 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789909 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 16:35:05.796439 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789911 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 16:35:05.796439 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789913 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 16:35:05.796439 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789916 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 16:35:05.796439 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789918 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 16:35:05.796439 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789921 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 16:35:05.796439 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789923 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 16:35:05.796439 ip-10-0-141-189 kubenswrapper[2561]: W0423 
16:35:05.789927 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 16:35:05.796439 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789931 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 16:35:05.796439 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789935 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 16:35:05.796439 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789937 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 16:35:05.796439 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789940 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 16:35:05.796439 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789942 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 16:35:05.796439 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789945 2561 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 16:35:05.796439 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789948 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 16:35:05.796439 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789951 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 16:35:05.796439 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789954 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 16:35:05.796439 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789957 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 16:35:05.796439 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789960 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 16:35:05.797089 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789963 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 16:35:05.797089 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789965 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 16:35:05.797089 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789972 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 16:35:05.797089 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789975 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 16:35:05.797089 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789978 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 16:35:05.797089 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789980 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 16:35:05.797089 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789983 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 16:35:05.797089 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789986 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 16:35:05.797089 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789989 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 16:35:05.797089 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789991 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 16:35:05.797089 ip-10-0-141-189 kubenswrapper[2561]: W0423 
16:35:05.789994 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 16:35:05.797089 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789997 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 16:35:05.797089 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.789999 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 16:35:05.797089 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.790002 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 16:35:05.797089 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.790004 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 16:35:05.797089 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.790007 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 16:35:05.797089 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.790010 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 16:35:05.797510 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.790701 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 16:35:05.798273 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.798255 2561 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 16:35:05.798305 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.798273 2561 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 16:35:05.798343 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798333 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 16:35:05.798343 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798342 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 16:35:05.798406 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798346 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 16:35:05.798406 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798350 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 16:35:05.798406 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798353 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 16:35:05.798406 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798356 2561 feature_gate.go:328] unrecognized feature gate: Example Apr 23 16:35:05.798406 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798359 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 16:35:05.798406 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798364 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
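
The I-level entry at feature_gate.go:384 is the authoritative record of what this kubelet actually enabled; the W-level lines around it are parse-time noise. A minimal sketch for pulling that map out of a capture of this journal (Python; the file name kubelet.log and the exact "feature gates: {map[...]}" layout are assumptions read directly off the lines above):

import re

PATTERN = re.compile(r"feature gates: \{map\[([^\]]*)\]\}")

def effective_gates(journal_text: str) -> dict:
    """Return {gate_name: bool} from the last feature_gate.go:384 dump."""
    gates = {}
    for match in PATTERN.finditer(journal_text):
        for pair in match.group(1).split():
            name, _, value = pair.partition(":")
            gates[name] = (value == "true")
    return gates

with open("kubelet.log") as f:  # hypothetical flat capture of this journal
    print(effective_gates(f.read()))

Running it against this boot should print KMSv1 and ServiceAccountTokenNodeBinding as true, matching the map logged above.
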
Apr 23 16:35:05.798406 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798368 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 16:35:05.798406 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798371 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 16:35:05.798406 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798374 2561 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 16:35:05.798406 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798378 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 16:35:05.798406 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798380 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 16:35:05.798406 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798384 2561 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 16:35:05.798406 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798387 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 16:35:05.798406 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798389 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 16:35:05.798406 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798392 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 16:35:05.798406 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798395 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 16:35:05.798406 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798398 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 16:35:05.798406 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798400 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 16:35:05.798406 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798403 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 16:35:05.798925 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798406 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 16:35:05.798925 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798408 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 16:35:05.798925 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798418 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 16:35:05.798925 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798421 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 16:35:05.798925 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798424 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 16:35:05.798925 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798426 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 16:35:05.798925 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798429 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 16:35:05.798925 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798432 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 16:35:05.798925 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798434 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 
16:35:05.798925 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798437 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 16:35:05.798925 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798439 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 16:35:05.798925 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798442 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 16:35:05.798925 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798444 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 16:35:05.798925 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798447 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 16:35:05.798925 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798449 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 16:35:05.798925 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798452 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 16:35:05.798925 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798454 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 16:35:05.798925 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798457 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 16:35:05.798925 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798459 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 16:35:05.799386 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798461 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 16:35:05.799386 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798464 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 16:35:05.799386 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798466 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 16:35:05.799386 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798469 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 16:35:05.799386 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798472 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 16:35:05.799386 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798474 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 16:35:05.799386 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798477 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 16:35:05.799386 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798480 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 16:35:05.799386 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798483 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
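
Because the kubelet re-parses its feature-gate configuration several times during startup (the passes at 16:35:05.786, .789, and .798 above), every unknown gate is warned about once per pass, so the raw warning volume overstates the problem. Collapsing to a distinct set makes the list reviewable; a sketch under the same kubelet.log assumption:

import re
from collections import Counter

warns = Counter()
with open("kubelet.log") as f:  # hypothetical flat capture of this journal
    for line in f:
        # findall handles captures that fold several entries per line
        for gate in re.findall(r"unrecognized feature gate: (\S+)", line):
            warns[gate] += 1

for gate, seen in sorted(warns.items()):
    print(f"{gate}: warned {seen}x")
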
Apr 23 16:35:05.799386 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798487 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 16:35:05.799386 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798490 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 16:35:05.799386 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798492 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 16:35:05.799386 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798495 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 16:35:05.799386 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798497 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 16:35:05.799386 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798501 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 16:35:05.799386 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798503 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 16:35:05.799386 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798506 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 16:35:05.799386 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798509 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 16:35:05.799386 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798511 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 16:35:05.799386 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798514 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 16:35:05.799887 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798516 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 16:35:05.799887 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798519 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 16:35:05.799887 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798522 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 16:35:05.799887 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798524 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 16:35:05.799887 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798527 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 16:35:05.799887 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798530 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 16:35:05.799887 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798533 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 16:35:05.799887 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798535 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 16:35:05.799887 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798538 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 16:35:05.799887 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798540 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 16:35:05.799887 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798543 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 16:35:05.799887 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798546 2561 
feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 16:35:05.799887 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798548 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 16:35:05.799887 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798550 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 16:35:05.799887 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798553 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 16:35:05.799887 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798556 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 16:35:05.799887 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798564 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 16:35:05.799887 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798567 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 16:35:05.799887 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798570 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 16:35:05.799887 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798572 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 16:35:05.800368 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798575 2561 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 16:35:05.800368 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798577 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 16:35:05.800368 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798580 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 16:35:05.800368 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798582 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 16:35:05.800368 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798585 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 16:35:05.800368 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798587 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 16:35:05.800368 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.798593 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 16:35:05.800368 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798698 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 16:35:05.800368 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798704 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 16:35:05.800368 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798707 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 16:35:05.800368 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798710 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 16:35:05.800368 ip-10-0-141-189 kubenswrapper[2561]: W0423 
16:35:05.798713 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 16:35:05.800368 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798715 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 16:35:05.800368 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798718 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 16:35:05.800368 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798720 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 16:35:05.800368 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798723 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 16:35:05.800757 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798725 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 16:35:05.800757 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798728 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 16:35:05.800757 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798730 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 16:35:05.800757 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798733 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 16:35:05.800757 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798735 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 16:35:05.800757 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798738 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 16:35:05.800757 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798740 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 16:35:05.800757 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798743 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 16:35:05.800757 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798746 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 16:35:05.800757 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798748 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 16:35:05.800757 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798751 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 16:35:05.800757 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798754 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 16:35:05.800757 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798757 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 16:35:05.800757 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798766 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
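
Three warning shapes are interleaved here, distinguishable by the feature_gate.go source line: :328 flags a gate name this kubelet build does not know (here, OpenShift cluster-level gates passed down to a component that ignores them), :349 a deprecated gate still being set (KMSv1=true), and :351 a gate that has already gone GA (ServiceAccountTokenNodeBinding=true). Only the :328 class means the setting has no effect. A small classifier sketch; the line-number-to-meaning mapping is read off this log, not taken from kubelet source:

import re

KINDS = {"328": "unknown to kubelet", "349": "deprecated", "351": "already GA"}

def classify(entry: str) -> str | None:
    m = re.search(r"feature_gate\.go:(\d+)\]", entry)
    return KINDS.get(m.group(1)) if m else None

print(classify('W0423 16:35:05.798364 2561 feature_gate.go:351] '
               'Setting GA feature gate ServiceAccountTokenNodeBinding=true.'))
# -> already GA
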
Apr 23 16:35:05.800757 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798771 2561 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 16:35:05.800757 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798774 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 16:35:05.800757 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798777 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 16:35:05.800757 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798780 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 16:35:05.800757 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798782 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 16:35:05.801249 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798785 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 16:35:05.801249 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798787 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 16:35:05.801249 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798791 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 16:35:05.801249 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798795 2561 feature_gate.go:328] unrecognized feature gate: Example Apr 23 16:35:05.801249 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798798 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 16:35:05.801249 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798800 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 16:35:05.801249 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798803 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 16:35:05.801249 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798805 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 16:35:05.801249 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798808 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 16:35:05.801249 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798811 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 16:35:05.801249 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798813 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 16:35:05.801249 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798816 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 16:35:05.801249 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798818 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 16:35:05.801249 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798821 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 16:35:05.801249 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798823 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 16:35:05.801249 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798826 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 16:35:05.801249 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798828 2561 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 16:35:05.801249 
ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798831 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 16:35:05.801249 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798834 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 16:35:05.801703 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798836 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 16:35:05.801703 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798839 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 16:35:05.801703 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798841 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 16:35:05.801703 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798844 2561 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 16:35:05.801703 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798846 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 16:35:05.801703 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798848 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 16:35:05.801703 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798851 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 16:35:05.801703 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798853 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 16:35:05.801703 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798857 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 16:35:05.801703 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798860 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 16:35:05.801703 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798862 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 16:35:05.801703 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798865 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 16:35:05.801703 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798867 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 16:35:05.801703 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798870 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 16:35:05.801703 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798886 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 16:35:05.801703 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798889 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 16:35:05.801703 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798891 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 16:35:05.801703 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798894 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 16:35:05.801703 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798897 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 16:35:05.801703 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798900 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 16:35:05.802198 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798902 2561 feature_gate.go:328] 
unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 16:35:05.802198 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798905 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 16:35:05.802198 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798907 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 16:35:05.802198 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798910 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 16:35:05.802198 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798912 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 16:35:05.802198 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798915 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 16:35:05.802198 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798917 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 16:35:05.802198 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798920 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 16:35:05.802198 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798923 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 16:35:05.802198 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798926 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 16:35:05.802198 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798928 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 16:35:05.802198 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798931 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 16:35:05.802198 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798934 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 16:35:05.802198 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798936 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 16:35:05.802198 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798939 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 16:35:05.802198 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798942 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 16:35:05.802198 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798944 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 16:35:05.802198 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798947 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 16:35:05.802198 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:05.798949 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 16:35:05.802685 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.798954 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 16:35:05.802685 ip-10-0-141-189 kubenswrapper[2561]: 
I0423 16:35:05.799706 2561 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 16:35:05.803378 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.803364 2561 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 16:35:05.804446 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.804435 2561 server.go:1019] "Starting client certificate rotation" Apr 23 16:35:05.804543 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.804529 2561 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 16:35:05.804573 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.804569 2561 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 16:35:05.830216 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.830200 2561 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 16:35:05.832230 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.832202 2561 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 16:35:05.845943 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.845921 2561 log.go:25] "Validated CRI v1 runtime API" Apr 23 16:35:05.852887 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.852861 2561 log.go:25] "Validated CRI v1 image API" Apr 23 16:35:05.857581 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.857556 2561 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 23 16:35:05.859514 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.859495 2561 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 16:35:05.865066 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.865042 2561 fs.go:135] Filesystem UUIDs: map[0357d501-3bb3-400c-a978-7fa50cdda464:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 8e826022-bd80-4c40-bc5f-f2c3dc7e6289:/dev/nvme0n1p4] Apr 23 16:35:05.865161 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.865064 2561 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 23 16:35:05.870827 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.870718 2561 manager.go:217] Machine: {Timestamp:2026-04-23 16:35:05.868626338 +0000 UTC m=+0.422060499 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3096439 MemoryCapacity:33164480512 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2aae254344302bf9cbc7ff28556dc3 SystemUUID:ec2aae25-4344-302b-f9cb-c7ff28556dc3 BootID:05df50fa-3475-47c7-8184-24b3007a68b4 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582238208 Type:vfs Inodes:4048398 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 
Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:92:3f:71:dc:93 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:92:3f:71:dc:93 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ce:0b:15:30:68:32 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164480512 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 23 16:35:05.870827 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.870815 2561 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
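
The Machine line above reports MemoryCapacity:33164480512 bytes, and the container manager config logged next reserves 1Gi of memory for the system ("SystemReserved") and holds back 100Mi for the memory.available hard-eviction signal. With KubeReserved null, node-allocatable memory works out as capacity minus those two terms; a worked check using only the values this kubelet prints:

GiB = 1024**3
MiB = 1024**2

capacity = 33164480512          # MemoryCapacity (bytes) from the Machine line
system_reserved = 1 * GiB       # SystemReserved memory: "1Gi"
hard_eviction = 100 * MiB       # memory.available hard threshold: "100Mi"

# allocatable = capacity - system-reserved - kube-reserved - hard eviction;
# kube-reserved is null on this node, so it drops out
allocatable = capacity - system_reserved - hard_eviction
print(f"allocatable: {allocatable} bytes (~{allocatable / GiB:.2f} GiB)")
# -> allocatable: 31985881088 bytes (~29.79 GiB)
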
Apr 23 16:35:05.871006 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.870926 2561 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 23 16:35:05.871299 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.871276 2561 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 23 16:35:05.871447 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.871299 2561 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-189.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 23 16:35:05.871529 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.871460 2561 topology_manager.go:138] "Creating topology manager with none policy" Apr 23 16:35:05.871529 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.871473 2561 container_manager_linux.go:306] "Creating device plugin manager" Apr 23 16:35:05.871529 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.871491 2561 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 16:35:05.872280 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.872268 2561 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 16:35:05.873751 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.873738 2561 state_mem.go:36] "Initialized new in-memory state store" Apr 23 16:35:05.873896 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.873868 2561 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 16:35:05.876993 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.876981 2561 kubelet.go:491] "Attempting to sync node with API server" Apr 23 16:35:05.877067 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.876997 2561 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 23 16:35:05.877067 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.877018 2561 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 16:35:05.877067 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.877031 2561 kubelet.go:397] "Adding apiserver pod source" Apr 23 16:35:05.877067 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.877042 2561 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 23 16:35:05.878125 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.878113 2561 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 16:35:05.878188 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.878135 2561 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 16:35:05.881230 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.881212 2561 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 16:35:05.883575 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.883560 2561 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 16:35:05.885416 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.885404 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 16:35:05.885483 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.885420 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 16:35:05.885483 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.885427 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 16:35:05.885483 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.885432 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 16:35:05.885483 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.885437 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 16:35:05.885483 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.885443 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 16:35:05.885483 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.885449 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 16:35:05.885483 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.885454 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 16:35:05.885483 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.885462 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 16:35:05.885483 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.885468 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 16:35:05.885483 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.885483 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 16:35:05.885731 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.885491 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 16:35:05.886304 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.886295 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 16:35:05.886304 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.886304 2561 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 23 16:35:05.889495 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.889480 2561 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-189.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 16:35:05.889850 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.889832 2561 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 16:35:05.889952 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.889885 2561 server.go:1295] "Started kubelet" Apr 23 16:35:05.890055 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.890016 2561 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 16:35:05.890101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.890072 2561 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 16:35:05.890500 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:05.890478 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 16:35:05.890500 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:05.890481 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-189.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 16:35:05.890633 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.890509 2561 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 16:35:05.890627 ip-10-0-141-189 systemd[1]: Started Kubernetes Kubelet. 
Apr 23 16:35:05.891844 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.891825 2561 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 16:35:05.893139 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.893123 2561 server.go:317] "Adding debug handlers to kubelet server" Apr 23 16:35:05.898545 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.898524 2561 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 16:35:05.899017 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:05.897811 2561 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-189.ec2.internal.18a9099edce4fa6c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-189.ec2.internal,UID:ip-10-0-141-189.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-189.ec2.internal,},FirstTimestamp:2026-04-23 16:35:05.889847916 +0000 UTC m=+0.443282084,LastTimestamp:2026-04-23 16:35:05.889847916 +0000 UTC m=+0.443282084,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-189.ec2.internal,}" Apr 23 16:35:05.899172 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.899159 2561 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 16:35:05.899628 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:05.899610 2561 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 16:35:05.899989 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.899970 2561 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 16:35:05.899989 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.899973 2561 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 16:35:05.900100 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.899996 2561 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 16:35:05.900100 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.900047 2561 factory.go:153] Registering CRI-O factory
Apr 23 16:35:05.900100 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.900098 2561 factory.go:223] Registration of the crio container factory successfully
Apr 23 16:35:05.900203 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.900146 2561 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 16:35:05.901207 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.901184 2561 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 16:35:05.901207 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.901207 2561 factory.go:55] Registering systemd factory
Apr 23 16:35:05.901358 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.901216 2561 factory.go:223] Registration of the systemd container factory successfully
Apr 23 16:35:05.901358 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.901240 2561 factory.go:103] Registering Raw factory
Apr 23 16:35:05.901358 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.901190 2561 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 16:35:05.901358 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:05.901237 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found"
Apr 23 16:35:05.901358 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.901255 2561 manager.go:1196] Started watching for new ooms in manager
Apr 23 16:35:05.902055 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.902037 2561 manager.go:319] Starting recovery of all containers
Apr 23 16:35:05.906186 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:05.906164 2561 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-141-189.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 23 16:35:05.907269 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:05.907239 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 16:35:05.910123 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.910095 2561 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 16:35:05.913524 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.913362 2561 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ssr5p"
Apr 23 16:35:05.914396 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.914381 2561 manager.go:324] Recovery completed
Apr 23 16:35:05.918796 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.918784 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:05.919980 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.919963 2561 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ssr5p"
Apr 23 16:35:05.921359 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.921345 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:05.921448 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.921369 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:05.921448 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.921379 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:05.921868 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.921855 2561 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 16:35:05.921868 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.921867 2561 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 16:35:05.921950 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.921896 2561 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 16:35:05.923321 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:05.923266 2561 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-189.ec2.internal.18a9099edec5c49d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-189.ec2.internal,UID:ip-10-0-141-189.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-141-189.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-141-189.ec2.internal,},FirstTimestamp:2026-04-23 16:35:05.921356957 +0000 UTC m=+0.474791116,LastTimestamp:2026-04-23 16:35:05.921356957 +0000 UTC m=+0.474791116,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-189.ec2.internal,}"
Apr 23 16:35:05.924363 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.924352 2561 policy_none.go:49] "None policy: Start"
Apr 23 16:35:05.924415 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.924369 2561 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 16:35:05.924415 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.924379 2561 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 16:35:05.956439 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.956428 2561 manager.go:341] "Starting Device Plugin manager"
Apr 23 16:35:05.970703 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:05.956459 2561 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 16:35:05.970703 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.956468 2561 server.go:85] "Starting device plugin registration server"
Apr 23 16:35:05.970703 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.956623 2561 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 16:35:05.970703 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.956634 2561 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 16:35:05.970703 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.956733 2561 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 16:35:05.970703 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.956795 2561 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 16:35:05.970703 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:05.956800 2561 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 16:35:05.970703 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:05.957455 2561 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 16:35:05.970703 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:05.957489 2561 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-189.ec2.internal\" not found"
Apr 23 16:35:06.027035 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.027017 2561 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 16:35:06.027132 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.027044 2561 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 16:35:06.027132 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.027059 2561 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
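
Note: the system:anonymous denials above mark the normal TLS-bootstrap window. Until csr-ssr5p is issued a few hundred milliseconds later, the kubelet has no client certificate, so its lease and informer requests are rejected by RBAC. Once bootstrapped, the node heartbeat is a coordination.k8s.io/v1 Lease in kube-node-lease, which can be inspected with client-go; a minimal sketch, where the kubeconfig path is an assumption and only the node and namespace names come from the log:

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Path is illustrative; any kubeconfig with read access works.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/kubeconfig")
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	// Node heartbeats live as Leases in the kube-node-lease namespace.
	lease, err := cs.CoordinationV1().Leases("kube-node-lease").
		Get(context.TODO(), "ip-10-0-141-189.ec2.internal", metav1.GetOptions{})
	if err != nil {
		// While the kubelet is still anonymous this returns Forbidden,
		// matching the controller.go:145 retry above.
		fmt.Println("lease not readable yet:", err)
		return
	}
	if lease.Spec.HolderIdentity != nil {
		fmt.Println("held by:", *lease.Spec.HolderIdentity, "renewed:", lease.Spec.RenewTime)
	}
}
```
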
Apr 23 16:35:06.027132 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.027068 2561 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 16:35:06.027132 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:06.027110 2561 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 16:35:06.029614 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.029573 2561 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:06.056932 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.056914 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:06.057604 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.057586 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:06.057681 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.057611 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:06.057681 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.057621 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:06.057681 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.057639 2561 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-189.ec2.internal"
Apr 23 16:35:06.064563 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.064549 2561 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-189.ec2.internal"
Apr 23 16:35:06.064632 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:06.064568 2561 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-189.ec2.internal\": node \"ip-10-0-141-189.ec2.internal\" not found"
Apr 23 16:35:06.097421 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:06.097399 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found"
Apr 23 16:35:06.127489 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.127468 2561 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal"]
Apr 23 16:35:06.127570 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.127537 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:06.128277 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.128262 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:06.128343 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.128292 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:06.128343 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.128303 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:06.129538 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.129526 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:06.129680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.129666 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal"
Apr 23 16:35:06.129748 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.129692 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:06.130185 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.130171 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:06.130248 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.130190 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:06.130248 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.130199 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:06.130248 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.130212 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:06.130248 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.130225 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:06.130394 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.130213 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:06.131692 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.131678 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal"
Apr 23 16:35:06.131753 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.131701 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:06.132287 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.132272 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:06.132351 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.132301 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:06.132351 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.132313 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:06.154481 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:06.154462 2561 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-189.ec2.internal\" not found" node="ip-10-0-141-189.ec2.internal"
Apr 23 16:35:06.158656 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:06.158642 2561 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-189.ec2.internal\" not found" node="ip-10-0-141-189.ec2.internal"
Apr 23 16:35:06.197905 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:06.197864 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found"
Apr 23 16:35:06.203557 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.203541 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/17dbb9da6ea0f39e403c5d7aceab008d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal\" (UID: \"17dbb9da6ea0f39e403c5d7aceab008d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal"
Apr 23 16:35:06.203631 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.203564 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17dbb9da6ea0f39e403c5d7aceab008d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal\" (UID: \"17dbb9da6ea0f39e403c5d7aceab008d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal"
Apr 23 16:35:06.203631 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.203581 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/11fa0aa737c019b5fa81fed20955549f-config\") pod \"kube-apiserver-proxy-ip-10-0-141-189.ec2.internal\" (UID: \"11fa0aa737c019b5fa81fed20955549f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal"
Apr 23 16:35:06.298423 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:06.298352 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found"
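
Note: the recurring kubelet_node_status.go:515 errors come from a lister, not from the API server. The kubelet registered the Node at 16:35:06.064, but its own reads go through an informer cache, which keeps returning NotFound until the watch delivers the new object. A sketch of that pattern, assuming an in-cluster client; the polling loop is illustrative, not the kubelet's actual wiring:

```go
package main

import (
	"context"
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig() // assumption: running inside the cluster
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	// Shared informer cache; reads served from memory, filled by a watch.
	factory := informers.NewSharedInformerFactory(cs, 0)
	nodeLister := factory.Core().V1().Nodes().Lister()

	ctx := context.Background()
	factory.Start(ctx.Done())
	factory.WaitForCacheSync(ctx.Done())

	// Until the cache catches up, Get returns NotFound, which is exactly
	// what the entries above log between registration and the first sync.
	for {
		if node, err := nodeLister.Get("ip-10-0-141-189.ec2.internal"); err == nil {
			fmt.Println("node visible in cache:", node.Name)
			return
		}
		time.Sleep(100 * time.Millisecond)
	}
}
```
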
\"kubernetes.io/host-path/17dbb9da6ea0f39e403c5d7aceab008d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal\" (UID: \"17dbb9da6ea0f39e403c5d7aceab008d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" Apr 23 16:35:06.304786 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.304734 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17dbb9da6ea0f39e403c5d7aceab008d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal\" (UID: \"17dbb9da6ea0f39e403c5d7aceab008d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" Apr 23 16:35:06.304786 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.304750 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/11fa0aa737c019b5fa81fed20955549f-config\") pod \"kube-apiserver-proxy-ip-10-0-141-189.ec2.internal\" (UID: \"11fa0aa737c019b5fa81fed20955549f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal" Apr 23 16:35:06.304855 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.304797 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/11fa0aa737c019b5fa81fed20955549f-config\") pod \"kube-apiserver-proxy-ip-10-0-141-189.ec2.internal\" (UID: \"11fa0aa737c019b5fa81fed20955549f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal" Apr 23 16:35:06.304855 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.304815 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/17dbb9da6ea0f39e403c5d7aceab008d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal\" (UID: \"17dbb9da6ea0f39e403c5d7aceab008d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" Apr 23 16:35:06.304855 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.304815 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17dbb9da6ea0f39e403c5d7aceab008d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal\" (UID: \"17dbb9da6ea0f39e403c5d7aceab008d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" Apr 23 16:35:06.399130 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:06.399094 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found" Apr 23 16:35:06.456481 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.456459 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" Apr 23 16:35:06.462125 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.462095 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal" Apr 23 16:35:06.499448 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:06.499423 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found" Apr 23 16:35:06.599935 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:06.599835 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found" Apr 23 16:35:06.700293 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:06.700264 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found" Apr 23 16:35:06.800719 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:06.800691 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found" Apr 23 16:35:06.803855 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.803842 2561 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 23 16:35:06.803994 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.803978 2561 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 16:35:06.899417 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.899338 2561 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 23 16:35:06.900770 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:06.900747 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found" Apr 23 16:35:06.914680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.914657 2561 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 16:35:06.922422 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.922392 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 16:30:05 +0000 UTC" deadline="2027-11-28 17:46:57.889458612 +0000 UTC" Apr 23 16:35:06.922422 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.922422 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14017h11m50.967040557s" Apr 23 16:35:06.945036 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.945016 2561 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-nfd8r" Apr 23 16:35:06.955865 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.955849 2561 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-nfd8r" Apr 23 16:35:06.997172 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:06.997148 2561 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:35:07.001542 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:07.001526 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found" Apr 23 
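
Note: both certificate managers schedule rotation well before expiry. The client certificate above expires 2028-04-22 but is queued for rotation on 2027-11-28, roughly 80% into its two-year lifetime; client-go jitters this point so a fleet of kubelets does not rotate simultaneously. A stdlib-only sketch of that idea; the 0.7 base and 0.2 jitter factor are assumptions for illustration, not the library's exact constants:

```go
package main

import (
	"fmt"
	"math/rand"
	"time"
)

// jitteredDeadline picks a rotation time at a random point late in the
// certificate's validity, approximating client-go's behavior.
func jitteredDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	// Somewhere in roughly the 70-90% band of the lifetime (assumed band).
	jittered := time.Duration(float64(total) * (0.7 + 0.2*rand.Float64()))
	return notBefore.Add(jittered)
}

func main() {
	// Dates taken from the log entries above (client certificate).
	notBefore := time.Date(2026, 4, 23, 16, 30, 5, 0, time.UTC)
	notAfter := time.Date(2028, 4, 22, 16, 30, 5, 0, time.UTC)
	d := jitteredDeadline(notBefore, notAfter)
	// The "Waiting for next certificate rotation" sleep is just deadline-now.
	fmt.Println("rotate at:", d, "sleep:", d.Sub(notBefore))
}
```
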
Apr 23 16:35:07.102418 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:07.102396 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-189.ec2.internal\" not found"
Apr 23 16:35:07.105753 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:07.105724 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11fa0aa737c019b5fa81fed20955549f.slice/crio-f123a4cf95493ea95c2cf525b2297fb350653cb6c702cf532c28ab8f1281b768 WatchSource:0}: Error finding container f123a4cf95493ea95c2cf525b2297fb350653cb6c702cf532c28ab8f1281b768: Status 404 returned error can't find the container with id f123a4cf95493ea95c2cf525b2297fb350653cb6c702cf532c28ab8f1281b768
Apr 23 16:35:07.106220 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:07.106201 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17dbb9da6ea0f39e403c5d7aceab008d.slice/crio-955136a60774489d71b36f91f9cfd0911d1721534976453ce237c58cbbb0c26b WatchSource:0}: Error finding container 955136a60774489d71b36f91f9cfd0911d1721534976453ce237c58cbbb0c26b: Status 404 returned error can't find the container with id 955136a60774489d71b36f91f9cfd0911d1721534976453ce237c58cbbb0c26b
Apr 23 16:35:07.111101 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.111086 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 16:35:07.129554 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.129538 2561 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:07.199032 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.198978 2561 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:07.200292 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.200277 2561 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal"
Apr 23 16:35:07.218793 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.218774 2561 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 16:35:07.219747 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.219735 2561 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal"
Apr 23 16:35:07.229804 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.229790 2561 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 16:35:07.713066 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.713040 2561 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:07.877859 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.877828 2561 apiserver.go:52] "Watching apiserver"
Apr 23 16:35:07.886449 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.886364 2561 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 16:35:07.888528 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.888469 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-rg2cr","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh","openshift-multus/multus-additional-cni-plugins-6tcrq","openshift-network-diagnostics/network-check-target-bclp4","openshift-network-operator/iptables-alerter-hvjxm","openshift-ovn-kubernetes/ovnkube-node-jhhzs","kube-system/konnectivity-agent-8c5j2","kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal","openshift-cluster-node-tuning-operator/tuned-gmwdt","openshift-dns/node-resolver-476qf","openshift-image-registry/node-ca-7wntm","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal","openshift-multus/multus-24kss"]
Apr 23 16:35:07.890599 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.890576 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8c5j2"
Apr 23 16:35:07.892751 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.892726 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh"
Apr 23 16:35:07.892852 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.892835 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6tcrq"
Apr 23 16:35:07.894037 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.893989 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 16:35:07.894451 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.894339 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-b995b\""
Apr 23 16:35:07.894451 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.894385 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 16:35:07.895387 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.895367 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 16:35:07.895621 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.895530 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 16:35:07.895701 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.895643 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-hlwxr\""
Apr 23 16:35:07.895701 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.895644 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 16:35:07.895805 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.895741 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 16:35:07.895862 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.895808 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 16:35:07.895939 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.895899 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nlksj\""
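
Note: the two warnings.go entries above (16:35:07.218 and 16:35:07.229) arrive because the mirror-pod names embed the node's dotted FQDN. A Pod name may be a DNS subdomain, but the hostname derived from it should be a DNS label, and dots violate that. The distinction can be checked with the apimachinery validation helpers; an illustrative sketch, not the API server's exact code path:

```go
package main

import (
	"fmt"

	"k8s.io/apimachinery/pkg/util/validation"
)

func main() {
	name := "kube-apiserver-proxy-ip-10-0-141-189.ec2.internal"

	// Valid as an object name: DNS subdomains allow dots, so this prints [].
	fmt.Println("as subdomain:", validation.IsDNS1123Subdomain(name))

	// Invalid as a hostname: DNS labels forbid dots, so this returns a
	// non-empty error list, which is why the API server attaches a warning.
	fmt.Println("as label:", validation.IsDNS1123Label(name))
}
```
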
Apr 23 16:35:07.896012 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.895991 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 16:35:07.896741 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.896721 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 16:35:07.896812 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.896731 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 16:35:07.897870 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.897848 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bclp4"
Apr 23 16:35:07.898172 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:07.898141 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bclp4" podUID="2f8d3b70-fe21-4feb-a984-12133895766b"
Apr 23 16:35:07.898249 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.898175 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hvjxm"
Apr 23 16:35:07.901766 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.901746 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:35:07.902039 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.902020 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2cr"
Apr 23 16:35:07.902146 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:07.902116 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2cr" podUID="5c564efe-4a26-4498-9f97-d71703d0aa18"
Apr 23 16:35:07.902206 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.902176 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.904069 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.903949 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 16:35:07.904354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.904194 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-75hxj\""
Apr 23 16:35:07.904354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.904237 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 16:35:07.904640 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.904623 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 16:35:07.904778 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.904759 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 16:35:07.904961 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.904941 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gmwdt"
Apr 23 16:35:07.905061 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.904969 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-476qf"
Apr 23 16:35:07.905061 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.905054 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 16:35:07.905509 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.905492 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 16:35:07.906260 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.906215 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 16:35:07.906260 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.906242 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rndf6\""
Apr 23 16:35:07.906575 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.906557 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7wntm"
Apr 23 16:35:07.907490 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.907475 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 16:35:07.907706 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.907688 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-24kss"
Apr 23 16:35:07.908259 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.908108 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 16:35:07.908259 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.908136 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-dwlr4\""
Apr 23 16:35:07.908259 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.908144 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-cjg42\""
Apr 23 16:35:07.908259 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.908150 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 16:35:07.908259 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.908163 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:35:07.908556 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.908456 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 16:35:07.909102 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.909082 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 16:35:07.909631 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.909612 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 16:35:07.909728 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.909712 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 16:35:07.910431 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.910410 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vggrm\""
Apr 23 16:35:07.910635 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.910619 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-zd6rx\""
Apr 23 16:35:07.910715 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.910645 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 16:35:07.912898 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.912863 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/51bf838a-6ae8-4ed3-bdd7-7040c25cac11-konnectivity-ca\") pod \"konnectivity-agent-8c5j2\" (UID: \"51bf838a-6ae8-4ed3-bdd7-7040c25cac11\") " pod="kube-system/konnectivity-agent-8c5j2"
Apr 23 16:35:07.913060 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913036 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9b78ad8-b01b-45a1-b022-80a305241f06-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh"
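
Note: the "Error syncing pod, skipping" entries above are expected ordering, not a fault. Every pod that needs CNI networking is held until ovnkube-node publishes a configuration under /etc/kubernetes/cni/net.d/, while host-network pods proceed. The same NetworkPluginNotReady text surfaces on the node's Ready condition; it can be read like this (illustrative, assumes in-cluster credentials):

```go
package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig() // assumption
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	node, err := cs.CoreV1().Nodes().Get(context.TODO(),
		"ip-10-0-141-189.ec2.internal", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, c := range node.Status.Conditions {
		if c.Type == corev1.NodeReady {
			// While CNI is missing this reports Status=False with the
			// same "container runtime network not ready" message.
			fmt.Printf("Ready=%s reason=%s msg=%s\n", c.Status, c.Reason, c.Message)
		}
	}
}
```
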
Apr 23 16:35:07.913150 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913070 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c9b78ad8-b01b-45a1-b022-80a305241f06-sys-fs\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh"
Apr 23 16:35:07.913150 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913095 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqlbn\" (UniqueName: \"kubernetes.io/projected/c9b78ad8-b01b-45a1-b022-80a305241f06-kube-api-access-wqlbn\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh"
Apr 23 16:35:07.913262 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913146 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0b73ac93-0681-4ba9-8f35-d63197135397-host-slash\") pod \"iptables-alerter-hvjxm\" (UID: \"0b73ac93-0681-4ba9-8f35-d63197135397\") " pod="openshift-network-operator/iptables-alerter-hvjxm"
Apr 23 16:35:07.913262 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913170 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.913262 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913197 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-ovn-node-metrics-cert\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.913262 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913222 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-systemd-units\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.913262 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913237 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-var-lib-openvswitch\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.913262 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913250 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-etc-openvswitch\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.913556 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913266 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/51bf838a-6ae8-4ed3-bdd7-7040c25cac11-agent-certs\") pod \"konnectivity-agent-8c5j2\" (UID: \"51bf838a-6ae8-4ed3-bdd7-7040c25cac11\") " pod="kube-system/konnectivity-agent-8c5j2"
Apr 23 16:35:07.913556 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913288 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c9b78ad8-b01b-45a1-b022-80a305241f06-registration-dir\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh"
Apr 23 16:35:07.913556 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913311 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq"
Apr 23 16:35:07.913556 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913337 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrhfn\" (UniqueName: \"kubernetes.io/projected/0b73ac93-0681-4ba9-8f35-d63197135397-kube-api-access-vrhfn\") pod \"iptables-alerter-hvjxm\" (UID: \"0b73ac93-0681-4ba9-8f35-d63197135397\") " pod="openshift-network-operator/iptables-alerter-hvjxm"
Apr 23 16:35:07.913556 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913358 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-node-log\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.913556 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913379 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c9b78ad8-b01b-45a1-b022-80a305241f06-socket-dir\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh"
Apr 23 16:35:07.913556 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913403 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-os-release\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq"
Apr 23 16:35:07.913556 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913454 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq"
Apr 23 16:35:07.913556 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913478 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dpwd\" (UniqueName: \"kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd\") pod \"network-check-target-bclp4\" (UID: \"2f8d3b70-fe21-4feb-a984-12133895766b\") " pod="openshift-network-diagnostics/network-check-target-bclp4"
Apr 23 16:35:07.913556 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913502 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-kubelet\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.913556 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913523 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-run-ovn\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.914098 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913571 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-log-socket\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.914098 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913607 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-env-overrides\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.914098 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913635 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-cni-binary-copy\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq"
Apr 23 16:35:07.914098 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913690 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0b73ac93-0681-4ba9-8f35-d63197135397-iptables-alerter-script\") pod \"iptables-alerter-hvjxm\" (UID: \"0b73ac93-0681-4ba9-8f35-d63197135397\") " pod="openshift-network-operator/iptables-alerter-hvjxm"
Apr 23 16:35:07.914098 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913713 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-run-ovn-kubernetes\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.914098 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913763 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c9b78ad8-b01b-45a1-b022-80a305241f06-etc-selinux\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh"
Apr 23 16:35:07.914098 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913788 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-system-cni-dir\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq"
Apr 23 16:35:07.914098 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913828 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf4k8\" (UniqueName: \"kubernetes.io/projected/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-kube-api-access-pf4k8\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq"
Apr 23 16:35:07.914098 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913853 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-slash\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.914098 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913906 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-cni-bin\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.914098 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913959 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-ovnkube-script-lib\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.914098 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.913987 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2mdl\" (UniqueName: \"kubernetes.io/projected/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-kube-api-access-g2mdl\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.914098 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.914012 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c9b78ad8-b01b-45a1-b022-80a305241f06-device-dir\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh"
Apr 23 16:35:07.914098 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.914036 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq"
Apr 23 16:35:07.914098 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.914060 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-run-openvswitch\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.914098 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.914095 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-ovnkube-config\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.914841 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.914122 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-cnibin\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq"
Apr 23 16:35:07.914841 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.914154 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-run-netns\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.914841 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.914202 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-run-systemd\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.914841 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.914243 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-cni-netd\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:07.958225 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.958194 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:30:06 +0000 UTC" deadline="2027-11-17 12:41:44.999481068 +0000 UTC"
Apr 23 16:35:07.958225 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:07.958225 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13748h6m37.041260788s"
Apr 23 16:35:08.000863 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.000803 2561 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 16:35:08.015025 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.014997 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs\") pod \"network-metrics-daemon-rg2cr\" (UID: \"5c564efe-4a26-4498-9f97-d71703d0aa18\") " pod="openshift-multus/network-metrics-daemon-rg2cr"
Apr 23 16:35:08.015160 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015036 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvqfq\" (UniqueName: \"kubernetes.io/projected/5c564efe-4a26-4498-9f97-d71703d0aa18-kube-api-access-gvqfq\") pod \"network-metrics-daemon-rg2cr\" (UID: \"5c564efe-4a26-4498-9f97-d71703d0aa18\") " pod="openshift-multus/network-metrics-daemon-rg2cr"
Apr 23 16:35:08.015160 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015062 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-etc-sysctl-d\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt"
Apr 23 16:35:08.015160 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015104 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-run\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt"
Apr 23 16:35:08.015320 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015160 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-multus-socket-dir-parent\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss"
Apr 23 16:35:08.015320 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015200 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-slash\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:08.015320 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015245 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-slash\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:08.015320 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015250 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2mdl\" (UniqueName: \"kubernetes.io/projected/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-kube-api-access-g2mdl\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:08.015320 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015287 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq"
Apr 23 16:35:08.015554 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015344 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/543e930d-5825-406c-b2da-236f7eef2b83-tmp\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt"
Apr 23 16:35:08.015554 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015377 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-run-openvswitch\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:08.015554 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015405 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-ovnkube-config\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:08.015554 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015422 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq"
Apr 23 16:35:08.015554 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015431 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-etc-modprobe-d\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt"
Apr 23 16:35:08.015554 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015448 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-run-openvswitch\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:08.015554 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015495 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9b78ad8-b01b-45a1-b022-80a305241f06-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh"
Apr 23 16:35:08.015554 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015518 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c9b78ad8-b01b-45a1-b022-80a305241f06-sys-fs\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh"
Apr 23 16:35:08.015554 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015547 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-etc-systemd\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt"
Apr 23 16:35:08.016037 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015577 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9b78ad8-b01b-45a1-b022-80a305241f06-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh"
Apr 23 16:35:08.016037 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015612 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c9b78ad8-b01b-45a1-b022-80a305241f06-sys-fs\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh"
Apr 23 16:35:08.016037 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015634 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2h54\" (UniqueName: \"kubernetes.io/projected/543e930d-5825-406c-b2da-236f7eef2b83-kube-api-access-p2h54\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt"
Apr 23 16:35:08.016037 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015659 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-cni-binary-copy\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss"
Apr 23 16:35:08.016037 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015682 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-host-run-k8s-cni-cncf-io\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss"
Apr 23 16:35:08.016037 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015702 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-multus-conf-dir\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss"
Apr 23 16:35:08.016037 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015734 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0b73ac93-0681-4ba9-8f35-d63197135397-host-slash\") pod \"iptables-alerter-hvjxm\" (UID: \"0b73ac93-0681-4ba9-8f35-d63197135397\") " pod="openshift-network-operator/iptables-alerter-hvjxm"
Apr 23 16:35:08.016037 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015758 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:08.016037 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015781 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-ovn-node-metrics-cert\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:08.016037 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015785 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0b73ac93-0681-4ba9-8f35-d63197135397-host-slash\") pod \"iptables-alerter-hvjxm\" (UID: \"0b73ac93-0681-4ba9-8f35-d63197135397\") " pod="openshift-network-operator/iptables-alerter-hvjxm"
Apr 23 16:35:08.016037 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015805 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-host-var-lib-cni-bin\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss"
Apr 23 16:35:08.016037 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015831 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-systemd-units\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:08.016037 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015836 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:08.016037 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015867 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-etc-openvswitch\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:08.016037 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015906 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-systemd-units\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:35:08.016037 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015949 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/51bf838a-6ae8-4ed3-bdd7-7040c25cac11-agent-certs\") pod \"konnectivity-agent-8c5j2\" (UID: \"51bf838a-6ae8-4ed3-bdd7-7040c25cac11\") " pod="kube-system/konnectivity-agent-8c5j2"
Apr 23 16:35:08.016037 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015953 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName:
\"kubernetes.io/configmap/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-ovnkube-config\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.016796 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.015975 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c9b78ad8-b01b-45a1-b022-80a305241f06-registration-dir\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh" Apr 23 16:35:08.016796 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016004 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcdc6\" (UniqueName: \"kubernetes.io/projected/9fd1b1de-6101-4e7d-a523-325e848a740a-kube-api-access-hcdc6\") pod \"node-ca-7wntm\" (UID: \"9fd1b1de-6101-4e7d-a523-325e848a740a\") " pod="openshift-image-registry/node-ca-7wntm" Apr 23 16:35:08.016796 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016013 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-etc-openvswitch\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.016796 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016109 2561 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 16:35:08.016796 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016129 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrhfn\" (UniqueName: \"kubernetes.io/projected/0b73ac93-0681-4ba9-8f35-d63197135397-kube-api-access-vrhfn\") pod \"iptables-alerter-hvjxm\" (UID: \"0b73ac93-0681-4ba9-8f35-d63197135397\") " pod="openshift-network-operator/iptables-alerter-hvjxm" Apr 23 16:35:08.016796 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016143 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c9b78ad8-b01b-45a1-b022-80a305241f06-registration-dir\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh" Apr 23 16:35:08.016796 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016199 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-node-log\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.016796 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016335 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-node-log\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.016796 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016344 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c9b78ad8-b01b-45a1-b022-80a305241f06-socket-dir\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh" Apr 23 16:35:08.016796 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016386 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-os-release\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq" Apr 23 16:35:08.016796 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016442 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-system-cni-dir\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.016796 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016485 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-cnibin\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.016796 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016491 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c9b78ad8-b01b-45a1-b022-80a305241f06-socket-dir\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh" Apr 23 16:35:08.016796 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016493 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-os-release\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq" Apr 23 16:35:08.016796 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016521 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-log-socket\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.016796 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016543 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-env-overrides\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.016796 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016544 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-log-socket\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.017579 ip-10-0-141-189 kubenswrapper[2561]: I0423 
16:35:08.016565 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-cni-binary-copy\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq" Apr 23 16:35:08.017579 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016590 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-sys\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.017579 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016614 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-lib-modules\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.017579 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016636 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-etc-kubernetes\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.017579 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016664 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0b73ac93-0681-4ba9-8f35-d63197135397-iptables-alerter-script\") pod \"iptables-alerter-hvjxm\" (UID: \"0b73ac93-0681-4ba9-8f35-d63197135397\") " pod="openshift-network-operator/iptables-alerter-hvjxm" Apr 23 16:35:08.017579 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016689 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-run-ovn-kubernetes\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.017579 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016712 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pf4k8\" (UniqueName: \"kubernetes.io/projected/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-kube-api-access-pf4k8\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq" Apr 23 16:35:08.017579 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016737 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/842de44e-cb5c-472a-9cf5-7d48346188d8-hosts-file\") pod \"node-resolver-476qf\" (UID: \"842de44e-cb5c-472a-9cf5-7d48346188d8\") " pod="openshift-dns/node-resolver-476qf" Apr 23 16:35:08.017579 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016760 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-os-release\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.017579 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016784 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-host-run-multus-certs\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.017579 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016809 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-cni-bin\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.017579 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016836 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-ovnkube-script-lib\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.017579 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016896 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c9b78ad8-b01b-45a1-b022-80a305241f06-device-dir\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh" Apr 23 16:35:08.017579 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016932 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-etc-sysconfig\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.017579 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016953 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-env-overrides\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.017579 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.016956 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-var-lib-kubelet\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.017579 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017010 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-cni-bin\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.018354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017014 2561 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-run-ovn-kubernetes\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.018354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017050 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-host-var-lib-cni-multus\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.018354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017070 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c9b78ad8-b01b-45a1-b022-80a305241f06-device-dir\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh" Apr 23 16:35:08.018354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017079 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-host-var-lib-kubelet\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.018354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017124 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-cnibin\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq" Apr 23 16:35:08.018354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017151 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-etc-sysctl-conf\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.018354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017157 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-cni-binary-copy\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq" Apr 23 16:35:08.018354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017193 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-cnibin\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq" Apr 23 16:35:08.018354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017191 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0b73ac93-0681-4ba9-8f35-d63197135397-iptables-alerter-script\") pod 
\"iptables-alerter-hvjxm\" (UID: \"0b73ac93-0681-4ba9-8f35-d63197135397\") " pod="openshift-network-operator/iptables-alerter-hvjxm" Apr 23 16:35:08.018354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017203 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9fd1b1de-6101-4e7d-a523-325e848a740a-host\") pod \"node-ca-7wntm\" (UID: \"9fd1b1de-6101-4e7d-a523-325e848a740a\") " pod="openshift-image-registry/node-ca-7wntm" Apr 23 16:35:08.018354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017234 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-hostroot\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.018354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017268 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-multus-daemon-config\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.018354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017344 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-run-netns\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.018354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017392 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-run-netns\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.018354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017392 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-run-systemd\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.018354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017424 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-run-systemd\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.018354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017433 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-cni-netd\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.019141 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017459 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/51bf838a-6ae8-4ed3-bdd7-7040c25cac11-konnectivity-ca\") pod \"konnectivity-agent-8c5j2\" (UID: \"51bf838a-6ae8-4ed3-bdd7-7040c25cac11\") " pod="kube-system/konnectivity-agent-8c5j2" Apr 23 16:35:08.019141 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017482 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-cni-netd\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.019141 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017485 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqlbn\" (UniqueName: \"kubernetes.io/projected/c9b78ad8-b01b-45a1-b022-80a305241f06-kube-api-access-wqlbn\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh" Apr 23 16:35:08.019141 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017513 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-etc-kubernetes\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.019141 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017529 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-ovnkube-script-lib\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.019141 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017541 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/543e930d-5825-406c-b2da-236f7eef2b83-etc-tuned\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.019141 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017565 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv2g2\" (UniqueName: \"kubernetes.io/projected/842de44e-cb5c-472a-9cf5-7d48346188d8-kube-api-access-tv2g2\") pod \"node-resolver-476qf\" (UID: \"842de44e-cb5c-472a-9cf5-7d48346188d8\") " pod="openshift-dns/node-resolver-476qf" Apr 23 16:35:08.019141 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017589 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/842de44e-cb5c-472a-9cf5-7d48346188d8-tmp-dir\") pod \"node-resolver-476qf\" (UID: \"842de44e-cb5c-472a-9cf5-7d48346188d8\") " pod="openshift-dns/node-resolver-476qf" Apr 23 16:35:08.019141 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017610 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9fd1b1de-6101-4e7d-a523-325e848a740a-serviceca\") pod \"node-ca-7wntm\" (UID: \"9fd1b1de-6101-4e7d-a523-325e848a740a\") " pod="openshift-image-registry/node-ca-7wntm" Apr 23 16:35:08.019141 ip-10-0-141-189 
kubenswrapper[2561]: I0423 16:35:08.017634 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qkd9\" (UniqueName: \"kubernetes.io/projected/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-kube-api-access-4qkd9\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.019141 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017670 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-var-lib-openvswitch\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.019141 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017705 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq" Apr 23 16:35:08.019141 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017725 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-var-lib-openvswitch\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.019141 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017736 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-host-run-netns\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.019141 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017783 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq" Apr 23 16:35:08.019141 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017807 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-host\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.019141 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017851 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dpwd\" (UniqueName: \"kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd\") pod \"network-check-target-bclp4\" (UID: \"2f8d3b70-fe21-4feb-a984-12133895766b\") " pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:08.019913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017870 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/51bf838a-6ae8-4ed3-bdd7-7040c25cac11-konnectivity-ca\") pod \"konnectivity-agent-8c5j2\" (UID: \"51bf838a-6ae8-4ed3-bdd7-7040c25cac11\") " pod="kube-system/konnectivity-agent-8c5j2" Apr 23 16:35:08.019913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017904 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-kubelet\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.019913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017930 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-run-ovn\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.019913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.017955 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-multus-cni-dir\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.019913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.018012 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c9b78ad8-b01b-45a1-b022-80a305241f06-etc-selinux\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh" Apr 23 16:35:08.019913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.018039 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-system-cni-dir\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq" Apr 23 16:35:08.019913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.018128 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-run-ovn\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.019913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.018148 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-system-cni-dir\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq" Apr 23 16:35:08.019913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.018172 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-host-kubelet\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.019913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.018275 2561 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq" Apr 23 16:35:08.019913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.018280 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c9b78ad8-b01b-45a1-b022-80a305241f06-etc-selinux\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh" Apr 23 16:35:08.019913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.018784 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq" Apr 23 16:35:08.019913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.019677 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-ovn-node-metrics-cert\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.020502 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.020067 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/51bf838a-6ae8-4ed3-bdd7-7040c25cac11-agent-certs\") pod \"konnectivity-agent-8c5j2\" (UID: \"51bf838a-6ae8-4ed3-bdd7-7040c25cac11\") " pod="kube-system/konnectivity-agent-8c5j2" Apr 23 16:35:08.029538 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:08.029517 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:08.029645 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:08.029552 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:08.029645 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:08.029566 2561 projected.go:194] Error preparing data for projected volume kube-api-access-7dpwd for pod openshift-network-diagnostics/network-check-target-bclp4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:08.029750 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:08.029651 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd podName:2f8d3b70-fe21-4feb-a984-12133895766b nodeName:}" failed. No retries permitted until 2026-04-23 16:35:08.529615239 +0000 UTC m=+3.083049397 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7dpwd" (UniqueName: "kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd") pod "network-check-target-bclp4" (UID: "2f8d3b70-fe21-4feb-a984-12133895766b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:08.031364 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.031319 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal" event={"ID":"11fa0aa737c019b5fa81fed20955549f","Type":"ContainerStarted","Data":"f123a4cf95493ea95c2cf525b2297fb350653cb6c702cf532c28ab8f1281b768"} Apr 23 16:35:08.031468 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.031427 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrhfn\" (UniqueName: \"kubernetes.io/projected/0b73ac93-0681-4ba9-8f35-d63197135397-kube-api-access-vrhfn\") pod \"iptables-alerter-hvjxm\" (UID: \"0b73ac93-0681-4ba9-8f35-d63197135397\") " pod="openshift-network-operator/iptables-alerter-hvjxm" Apr 23 16:35:08.031976 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.031956 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqlbn\" (UniqueName: \"kubernetes.io/projected/c9b78ad8-b01b-45a1-b022-80a305241f06-kube-api-access-wqlbn\") pod \"aws-ebs-csi-driver-node-9pzwh\" (UID: \"c9b78ad8-b01b-45a1-b022-80a305241f06\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh" Apr 23 16:35:08.032427 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.032371 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" event={"ID":"17dbb9da6ea0f39e403c5d7aceab008d","Type":"ContainerStarted","Data":"955136a60774489d71b36f91f9cfd0911d1721534976453ce237c58cbbb0c26b"} Apr 23 16:35:08.033565 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.033548 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2mdl\" (UniqueName: \"kubernetes.io/projected/1f8739d4-2dae-4fa1-b0be-80c6e24dce30-kube-api-access-g2mdl\") pod \"ovnkube-node-jhhzs\" (UID: \"1f8739d4-2dae-4fa1-b0be-80c6e24dce30\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.034033 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.034010 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf4k8\" (UniqueName: \"kubernetes.io/projected/3b9f6eca-1a5a-42f3-acf5-01a9d352a780-kube-api-access-pf4k8\") pod \"multus-additional-cni-plugins-6tcrq\" (UID: \"3b9f6eca-1a5a-42f3-acf5-01a9d352a780\") " pod="openshift-multus/multus-additional-cni-plugins-6tcrq" Apr 23 16:35:08.118620 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.118593 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-host\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.118778 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.118631 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-multus-cni-dir\") pod \"multus-24kss\" (UID: 
\"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.118778 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.118656 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs\") pod \"network-metrics-daemon-rg2cr\" (UID: \"5c564efe-4a26-4498-9f97-d71703d0aa18\") " pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:08.118778 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.118679 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvqfq\" (UniqueName: \"kubernetes.io/projected/5c564efe-4a26-4498-9f97-d71703d0aa18-kube-api-access-gvqfq\") pod \"network-metrics-daemon-rg2cr\" (UID: \"5c564efe-4a26-4498-9f97-d71703d0aa18\") " pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:08.118778 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.118703 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-etc-sysctl-d\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.118778 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.118710 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-host\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.118778 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.118725 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-run\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.118778 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.118753 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-multus-socket-dir-parent\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.118778 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.118777 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/543e930d-5825-406c-b2da-236f7eef2b83-tmp\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.119169 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:08.118789 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:08.119169 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.118800 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-etc-modprobe-d\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.119169 ip-10-0-141-189 kubenswrapper[2561]: I0423 
16:35:08.118823 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-etc-systemd\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.119169 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.118823 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-multus-cni-dir\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.119169 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.118842 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-etc-sysctl-d\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.119169 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:08.118858 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs podName:5c564efe-4a26-4498-9f97-d71703d0aa18 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:08.6188381 +0000 UTC m=+3.172272270 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs") pod "network-metrics-daemon-rg2cr" (UID: "5c564efe-4a26-4498-9f97-d71703d0aa18") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:08.119169 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.118887 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-multus-socket-dir-parent\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.119169 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.118896 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p2h54\" (UniqueName: \"kubernetes.io/projected/543e930d-5825-406c-b2da-236f7eef2b83-kube-api-access-p2h54\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.119169 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.118918 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-cni-binary-copy\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.119169 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.118918 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-run\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.119169 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.118933 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-etc-modprobe-d\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.119169 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.118973 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-etc-systemd\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.119169 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.118975 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-host-run-k8s-cni-cncf-io\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.119169 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119015 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-multus-conf-dir\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.119169 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119044 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-host-var-lib-cni-bin\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.119169 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119050 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-multus-conf-dir\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.119169 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119017 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-host-run-k8s-cni-cncf-io\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.119897 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119073 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcdc6\" (UniqueName: \"kubernetes.io/projected/9fd1b1de-6101-4e7d-a523-325e848a740a-kube-api-access-hcdc6\") pod \"node-ca-7wntm\" (UID: \"9fd1b1de-6101-4e7d-a523-325e848a740a\") " pod="openshift-image-registry/node-ca-7wntm" Apr 23 16:35:08.119897 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119099 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-system-cni-dir\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.119897 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119101 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-host-var-lib-cni-bin\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.119897 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119122 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-cnibin\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.119897 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119164 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-system-cni-dir\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.119897 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119156 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-sys\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.119897 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119198 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-lib-modules\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.119897 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119223 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-cnibin\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.119897 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119237 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-sys\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.119897 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119240 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-etc-kubernetes\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.119897 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119276 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-etc-kubernetes\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.119897 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119294 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/842de44e-cb5c-472a-9cf5-7d48346188d8-hosts-file\") pod \"node-resolver-476qf\" (UID: \"842de44e-cb5c-472a-9cf5-7d48346188d8\") " 
pod="openshift-dns/node-resolver-476qf" Apr 23 16:35:08.119897 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119326 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-os-release\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.119897 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119332 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/842de44e-cb5c-472a-9cf5-7d48346188d8-hosts-file\") pod \"node-resolver-476qf\" (UID: \"842de44e-cb5c-472a-9cf5-7d48346188d8\") " pod="openshift-dns/node-resolver-476qf" Apr 23 16:35:08.119897 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119344 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-lib-modules\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.119897 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119355 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-host-run-multus-certs\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.119897 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119388 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-os-release\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.119897 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119391 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-etc-sysconfig\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.120680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119395 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-host-run-multus-certs\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.120680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119417 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-var-lib-kubelet\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.120680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119438 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-etc-sysconfig\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 
16:35:08.120680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119443 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-host-var-lib-cni-multus\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.120680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119468 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-host-var-lib-kubelet\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.120680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119481 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-var-lib-kubelet\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.120680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119495 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-cni-binary-copy\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.120680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119507 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-host-var-lib-cni-multus\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.120680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119495 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-etc-sysctl-conf\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.120680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119519 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-host-var-lib-kubelet\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.120680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119556 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9fd1b1de-6101-4e7d-a523-325e848a740a-host\") pod \"node-ca-7wntm\" (UID: \"9fd1b1de-6101-4e7d-a523-325e848a740a\") " pod="openshift-image-registry/node-ca-7wntm" Apr 23 16:35:08.120680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119596 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9fd1b1de-6101-4e7d-a523-325e848a740a-host\") pod \"node-ca-7wntm\" (UID: \"9fd1b1de-6101-4e7d-a523-325e848a740a\") " pod="openshift-image-registry/node-ca-7wntm" Apr 23 16:35:08.120680 ip-10-0-141-189 kubenswrapper[2561]: I0423 
16:35:08.119611 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-etc-sysctl-conf\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.120680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119625 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-hostroot\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.120680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119654 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-multus-daemon-config\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.120680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119685 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-etc-kubernetes\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.120680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119708 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/543e930d-5825-406c-b2da-236f7eef2b83-etc-tuned\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.120680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119716 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-hostroot\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.121561 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119731 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tv2g2\" (UniqueName: \"kubernetes.io/projected/842de44e-cb5c-472a-9cf5-7d48346188d8-kube-api-access-tv2g2\") pod \"node-resolver-476qf\" (UID: \"842de44e-cb5c-472a-9cf5-7d48346188d8\") " pod="openshift-dns/node-resolver-476qf" Apr 23 16:35:08.121561 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119743 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/543e930d-5825-406c-b2da-236f7eef2b83-etc-kubernetes\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.121561 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119757 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/842de44e-cb5c-472a-9cf5-7d48346188d8-tmp-dir\") pod \"node-resolver-476qf\" (UID: \"842de44e-cb5c-472a-9cf5-7d48346188d8\") " pod="openshift-dns/node-resolver-476qf" Apr 23 16:35:08.121561 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119781 2561 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9fd1b1de-6101-4e7d-a523-325e848a740a-serviceca\") pod \"node-ca-7wntm\" (UID: \"9fd1b1de-6101-4e7d-a523-325e848a740a\") " pod="openshift-image-registry/node-ca-7wntm" Apr 23 16:35:08.121561 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119804 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qkd9\" (UniqueName: \"kubernetes.io/projected/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-kube-api-access-4qkd9\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.121561 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.119830 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-host-run-netns\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.121561 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.120592 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-host-run-netns\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.121561 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.120819 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-multus-daemon-config\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.121561 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.121221 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9fd1b1de-6101-4e7d-a523-325e848a740a-serviceca\") pod \"node-ca-7wntm\" (UID: \"9fd1b1de-6101-4e7d-a523-325e848a740a\") " pod="openshift-image-registry/node-ca-7wntm" Apr 23 16:35:08.121561 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.121328 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/842de44e-cb5c-472a-9cf5-7d48346188d8-tmp-dir\") pod \"node-resolver-476qf\" (UID: \"842de44e-cb5c-472a-9cf5-7d48346188d8\") " pod="openshift-dns/node-resolver-476qf" Apr 23 16:35:08.121561 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.121556 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/543e930d-5825-406c-b2da-236f7eef2b83-tmp\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.124713 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.124690 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/543e930d-5825-406c-b2da-236f7eef2b83-etc-tuned\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.135612 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.135589 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvqfq\" (UniqueName: 
\"kubernetes.io/projected/5c564efe-4a26-4498-9f97-d71703d0aa18-kube-api-access-gvqfq\") pod \"network-metrics-daemon-rg2cr\" (UID: \"5c564efe-4a26-4498-9f97-d71703d0aa18\") " pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:08.136029 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.136013 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcdc6\" (UniqueName: \"kubernetes.io/projected/9fd1b1de-6101-4e7d-a523-325e848a740a-kube-api-access-hcdc6\") pod \"node-ca-7wntm\" (UID: \"9fd1b1de-6101-4e7d-a523-325e848a740a\") " pod="openshift-image-registry/node-ca-7wntm" Apr 23 16:35:08.142459 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.142432 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qkd9\" (UniqueName: \"kubernetes.io/projected/6b541e1a-ac49-4260-93b3-d5e6e7e04eb5-kube-api-access-4qkd9\") pod \"multus-24kss\" (UID: \"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5\") " pod="openshift-multus/multus-24kss" Apr 23 16:35:08.142741 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.142720 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv2g2\" (UniqueName: \"kubernetes.io/projected/842de44e-cb5c-472a-9cf5-7d48346188d8-kube-api-access-tv2g2\") pod \"node-resolver-476qf\" (UID: \"842de44e-cb5c-472a-9cf5-7d48346188d8\") " pod="openshift-dns/node-resolver-476qf" Apr 23 16:35:08.142917 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.142898 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2h54\" (UniqueName: \"kubernetes.io/projected/543e930d-5825-406c-b2da-236f7eef2b83-kube-api-access-p2h54\") pod \"tuned-gmwdt\" (UID: \"543e930d-5825-406c-b2da-236f7eef2b83\") " pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.202865 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.202830 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8c5j2" Apr 23 16:35:08.211652 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.211619 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh" Apr 23 16:35:08.222279 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.222257 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6tcrq" Apr 23 16:35:08.229839 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.229820 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hvjxm" Apr 23 16:35:08.236383 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.236365 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:08.243950 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.243929 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" Apr 23 16:35:08.251452 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.251402 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-476qf" Apr 23 16:35:08.256826 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.256789 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-7wntm" Apr 23 16:35:08.263348 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.263327 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-24kss" Apr 23 16:35:08.624611 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.624574 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dpwd\" (UniqueName: \"kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd\") pod \"network-check-target-bclp4\" (UID: \"2f8d3b70-fe21-4feb-a984-12133895766b\") " pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:08.624796 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.624628 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs\") pod \"network-metrics-daemon-rg2cr\" (UID: \"5c564efe-4a26-4498-9f97-d71703d0aa18\") " pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:08.624796 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:08.624754 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:08.624796 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:08.624757 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:08.624796 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:08.624780 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:08.624796 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:08.624791 2561 projected.go:194] Error preparing data for projected volume kube-api-access-7dpwd for pod openshift-network-diagnostics/network-check-target-bclp4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:08.625029 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:08.624818 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs podName:5c564efe-4a26-4498-9f97-d71703d0aa18 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:09.624800074 +0000 UTC m=+4.178234224 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs") pod "network-metrics-daemon-rg2cr" (UID: "5c564efe-4a26-4498-9f97-d71703d0aa18") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:08.625029 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:08.624845 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd podName:2f8d3b70-fe21-4feb-a984-12133895766b nodeName:}" failed. No retries permitted until 2026-04-23 16:35:09.624832326 +0000 UTC m=+4.178266472 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7dpwd" (UniqueName: "kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd") pod "network-check-target-bclp4" (UID: "2f8d3b70-fe21-4feb-a984-12133895766b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:08.782153 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:08.782117 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b541e1a_ac49_4260_93b3_d5e6e7e04eb5.slice/crio-6a96d5dfec7478be2f9b37ea3efefdb5cff530207e6c93e3916d7b952fd2d714 WatchSource:0}: Error finding container 6a96d5dfec7478be2f9b37ea3efefdb5cff530207e6c93e3916d7b952fd2d714: Status 404 returned error can't find the container with id 6a96d5dfec7478be2f9b37ea3efefdb5cff530207e6c93e3916d7b952fd2d714 Apr 23 16:35:08.783743 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:08.783717 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b73ac93_0681_4ba9_8f35_d63197135397.slice/crio-d1ae0f672c67c4f6e4d1c73fc929fbcb17bb79dbc80c4535a01e872caf008d91 WatchSource:0}: Error finding container d1ae0f672c67c4f6e4d1c73fc929fbcb17bb79dbc80c4535a01e872caf008d91: Status 404 returned error can't find the container with id d1ae0f672c67c4f6e4d1c73fc929fbcb17bb79dbc80c4535a01e872caf008d91 Apr 23 16:35:08.784990 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:08.784970 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51bf838a_6ae8_4ed3_bdd7_7040c25cac11.slice/crio-6ab125abfc2390a50ff72f37e11f27dde994be8da77b13d8a6ae26193d977c20 WatchSource:0}: Error finding container 6ab125abfc2390a50ff72f37e11f27dde994be8da77b13d8a6ae26193d977c20: Status 404 returned error can't find the container with id 6ab125abfc2390a50ff72f37e11f27dde994be8da77b13d8a6ae26193d977c20 Apr 23 16:35:08.787776 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:08.787753 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9b78ad8_b01b_45a1_b022_80a305241f06.slice/crio-166ba2bd42deec7ec8dc01c8e545ea82c61e1d4dbdf49eb40b68d288370533d3 WatchSource:0}: Error finding container 166ba2bd42deec7ec8dc01c8e545ea82c61e1d4dbdf49eb40b68d288370533d3: Status 404 returned error can't find the container with id 166ba2bd42deec7ec8dc01c8e545ea82c61e1d4dbdf49eb40b68d288370533d3 Apr 23 16:35:08.789165 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:08.789068 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b9f6eca_1a5a_42f3_acf5_01a9d352a780.slice/crio-695565edd6a9d55b98bbbba2d8b7da57463da8b9e3f03eef6b87fbea3c4d7764 WatchSource:0}: Error finding container 695565edd6a9d55b98bbbba2d8b7da57463da8b9e3f03eef6b87fbea3c4d7764: Status 404 returned error can't find the container with id 695565edd6a9d55b98bbbba2d8b7da57463da8b9e3f03eef6b87fbea3c4d7764 Apr 23 16:35:08.790269 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:08.789804 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod842de44e_cb5c_472a_9cf5_7d48346188d8.slice/crio-a8c7766e35dbddf897e5bae786318a294a3f92729c994d2c7fe55d394b774133 WatchSource:0}: Error finding 
container a8c7766e35dbddf897e5bae786318a294a3f92729c994d2c7fe55d394b774133: Status 404 returned error can't find the container with id a8c7766e35dbddf897e5bae786318a294a3f92729c994d2c7fe55d394b774133 Apr 23 16:35:08.790531 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:08.790504 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f8739d4_2dae_4fa1_b0be_80c6e24dce30.slice/crio-09920524134e77664c1e56f5c9a78d556ae98676a40d361777fc7e20fd7eb6eb WatchSource:0}: Error finding container 09920524134e77664c1e56f5c9a78d556ae98676a40d361777fc7e20fd7eb6eb: Status 404 returned error can't find the container with id 09920524134e77664c1e56f5c9a78d556ae98676a40d361777fc7e20fd7eb6eb Apr 23 16:35:08.792162 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:08.792128 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod543e930d_5825_406c_b2da_236f7eef2b83.slice/crio-c0ee5c462d0acbd1ef08dfd92726a292a4d953afd597f35dc86d09b6628bc4c7 WatchSource:0}: Error finding container c0ee5c462d0acbd1ef08dfd92726a292a4d953afd597f35dc86d09b6628bc4c7: Status 404 returned error can't find the container with id c0ee5c462d0acbd1ef08dfd92726a292a4d953afd597f35dc86d09b6628bc4c7 Apr 23 16:35:08.793050 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:08.793034 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fd1b1de_6101_4e7d_a523_325e848a740a.slice/crio-d63b7c0f5c5a5b5d887c4f7e1e8d880bf92622bd4d30a69ef9360c697efc958a WatchSource:0}: Error finding container d63b7c0f5c5a5b5d887c4f7e1e8d880bf92622bd4d30a69ef9360c697efc958a: Status 404 returned error can't find the container with id d63b7c0f5c5a5b5d887c4f7e1e8d880bf92622bd4d30a69ef9360c697efc958a Apr 23 16:35:08.829778 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.829760 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-ng5vn"] Apr 23 16:35:08.831927 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.831911 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:08.831997 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:08.831976 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ng5vn" podUID="867d699c-6a2c-41bb-b885-26b9cd952983" Apr 23 16:35:08.926408 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.926257 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret\") pod \"global-pull-secret-syncer-ng5vn\" (UID: \"867d699c-6a2c-41bb-b885-26b9cd952983\") " pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:08.926408 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.926379 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/867d699c-6a2c-41bb-b885-26b9cd952983-dbus\") pod \"global-pull-secret-syncer-ng5vn\" (UID: \"867d699c-6a2c-41bb-b885-26b9cd952983\") " pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:08.926744 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.926428 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/867d699c-6a2c-41bb-b885-26b9cd952983-kubelet-config\") pod \"global-pull-secret-syncer-ng5vn\" (UID: \"867d699c-6a2c-41bb-b885-26b9cd952983\") " pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:08.958502 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.958477 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:30:06 +0000 UTC" deadline="2028-02-04 13:54:50.114814888 +0000 UTC" Apr 23 16:35:08.958502 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:08.958500 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15645h19m41.156316834s" Apr 23 16:35:09.026913 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:09.026893 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/867d699c-6a2c-41bb-b885-26b9cd952983-kubelet-config\") pod \"global-pull-secret-syncer-ng5vn\" (UID: \"867d699c-6a2c-41bb-b885-26b9cd952983\") " pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:09.026996 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:09.026933 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret\") pod \"global-pull-secret-syncer-ng5vn\" (UID: \"867d699c-6a2c-41bb-b885-26b9cd952983\") " pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:09.026996 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:09.026950 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/867d699c-6a2c-41bb-b885-26b9cd952983-dbus\") pod \"global-pull-secret-syncer-ng5vn\" (UID: \"867d699c-6a2c-41bb-b885-26b9cd952983\") " pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:09.027105 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:09.027021 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/867d699c-6a2c-41bb-b885-26b9cd952983-kubelet-config\") pod \"global-pull-secret-syncer-ng5vn\" (UID: \"867d699c-6a2c-41bb-b885-26b9cd952983\") " pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 
23 16:35:09.027105 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:09.027064 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:09.027183 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:09.027091 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/867d699c-6a2c-41bb-b885-26b9cd952983-dbus\") pod \"global-pull-secret-syncer-ng5vn\" (UID: \"867d699c-6a2c-41bb-b885-26b9cd952983\") " pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:09.027183 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:09.027135 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret podName:867d699c-6a2c-41bb-b885-26b9cd952983 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:09.527115856 +0000 UTC m=+4.080550013 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret") pod "global-pull-secret-syncer-ng5vn" (UID: "867d699c-6a2c-41bb-b885-26b9cd952983") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:09.027260 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:09.027240 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:09.027364 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:09.027345 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bclp4" podUID="2f8d3b70-fe21-4feb-a984-12133895766b" Apr 23 16:35:09.034313 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:09.034293 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-476qf" event={"ID":"842de44e-cb5c-472a-9cf5-7d48346188d8","Type":"ContainerStarted","Data":"a8c7766e35dbddf897e5bae786318a294a3f92729c994d2c7fe55d394b774133"} Apr 23 16:35:09.035150 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:09.035131 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6tcrq" event={"ID":"3b9f6eca-1a5a-42f3-acf5-01a9d352a780","Type":"ContainerStarted","Data":"695565edd6a9d55b98bbbba2d8b7da57463da8b9e3f03eef6b87fbea3c4d7764"} Apr 23 16:35:09.036036 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:09.036017 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hvjxm" event={"ID":"0b73ac93-0681-4ba9-8f35-d63197135397","Type":"ContainerStarted","Data":"d1ae0f672c67c4f6e4d1c73fc929fbcb17bb79dbc80c4535a01e872caf008d91"} Apr 23 16:35:09.036981 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:09.036949 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh" event={"ID":"c9b78ad8-b01b-45a1-b022-80a305241f06","Type":"ContainerStarted","Data":"166ba2bd42deec7ec8dc01c8e545ea82c61e1d4dbdf49eb40b68d288370533d3"} Apr 23 16:35:09.037915 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:09.037896 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8c5j2" event={"ID":"51bf838a-6ae8-4ed3-bdd7-7040c25cac11","Type":"ContainerStarted","Data":"6ab125abfc2390a50ff72f37e11f27dde994be8da77b13d8a6ae26193d977c20"} Apr 23 16:35:09.039158 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:09.039130 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-24kss" event={"ID":"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5","Type":"ContainerStarted","Data":"6a96d5dfec7478be2f9b37ea3efefdb5cff530207e6c93e3916d7b952fd2d714"} Apr 23 16:35:09.040942 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:09.040923 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal" event={"ID":"11fa0aa737c019b5fa81fed20955549f","Type":"ContainerStarted","Data":"a5833451ac1f1e26b7f02bcb7fdd769f29d4d3f12e02aa78cee6707be4fe64be"} Apr 23 16:35:09.041958 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:09.041936 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7wntm" event={"ID":"9fd1b1de-6101-4e7d-a523-325e848a740a","Type":"ContainerStarted","Data":"d63b7c0f5c5a5b5d887c4f7e1e8d880bf92622bd4d30a69ef9360c697efc958a"} Apr 23 16:35:09.042844 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:09.042826 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" event={"ID":"543e930d-5825-406c-b2da-236f7eef2b83","Type":"ContainerStarted","Data":"c0ee5c462d0acbd1ef08dfd92726a292a4d953afd597f35dc86d09b6628bc4c7"} Apr 23 16:35:09.043831 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:09.043803 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" event={"ID":"1f8739d4-2dae-4fa1-b0be-80c6e24dce30","Type":"ContainerStarted","Data":"09920524134e77664c1e56f5c9a78d556ae98676a40d361777fc7e20fd7eb6eb"} Apr 23 16:35:09.055088 ip-10-0-141-189 
kubenswrapper[2561]: I0423 16:35:09.055046 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-189.ec2.internal" podStartSLOduration=2.055031643 podStartE2EDuration="2.055031643s" podCreationTimestamp="2026-04-23 16:35:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:09.054895235 +0000 UTC m=+3.608329400" watchObservedRunningTime="2026-04-23 16:35:09.055031643 +0000 UTC m=+3.608465813" Apr 23 16:35:09.533908 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:09.531009 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret\") pod \"global-pull-secret-syncer-ng5vn\" (UID: \"867d699c-6a2c-41bb-b885-26b9cd952983\") " pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:09.533908 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:09.531150 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:09.533908 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:09.531210 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret podName:867d699c-6a2c-41bb-b885-26b9cd952983 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:10.531191899 +0000 UTC m=+5.084626049 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret") pod "global-pull-secret-syncer-ng5vn" (UID: "867d699c-6a2c-41bb-b885-26b9cd952983") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:09.633980 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:09.633168 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dpwd\" (UniqueName: \"kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd\") pod \"network-check-target-bclp4\" (UID: \"2f8d3b70-fe21-4feb-a984-12133895766b\") " pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:09.633980 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:09.633219 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs\") pod \"network-metrics-daemon-rg2cr\" (UID: \"5c564efe-4a26-4498-9f97-d71703d0aa18\") " pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:09.633980 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:09.633377 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:09.633980 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:09.633435 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs podName:5c564efe-4a26-4498-9f97-d71703d0aa18 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:11.633417451 +0000 UTC m=+6.186851600 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs") pod "network-metrics-daemon-rg2cr" (UID: "5c564efe-4a26-4498-9f97-d71703d0aa18") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:09.633980 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:09.633513 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:09.633980 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:09.633584 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:09.633980 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:09.633597 2561 projected.go:194] Error preparing data for projected volume kube-api-access-7dpwd for pod openshift-network-diagnostics/network-check-target-bclp4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:09.633980 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:09.633641 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd podName:2f8d3b70-fe21-4feb-a984-12133895766b nodeName:}" failed. No retries permitted until 2026-04-23 16:35:11.633627809 +0000 UTC m=+6.187061956 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7dpwd" (UniqueName: "kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd") pod "network-check-target-bclp4" (UID: "2f8d3b70-fe21-4feb-a984-12133895766b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:10.032935 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:10.032368 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:10.032935 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:10.032507 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2cr" podUID="5c564efe-4a26-4498-9f97-d71703d0aa18" Apr 23 16:35:10.060233 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:10.060154 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" event={"ID":"17dbb9da6ea0f39e403c5d7aceab008d","Type":"ContainerStarted","Data":"4a9447bf3cdf4e83df847dbee6e5c4fa1d4406b32d53f4f1a97af98aa4ae4cb0"} Apr 23 16:35:10.541280 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:10.541245 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret\") pod \"global-pull-secret-syncer-ng5vn\" (UID: \"867d699c-6a2c-41bb-b885-26b9cd952983\") " pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:10.541468 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:10.541428 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:10.541534 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:10.541486 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret podName:867d699c-6a2c-41bb-b885-26b9cd952983 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:12.541469112 +0000 UTC m=+7.094903272 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret") pod "global-pull-secret-syncer-ng5vn" (UID: "867d699c-6a2c-41bb-b885-26b9cd952983") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:11.027750 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:11.027713 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:11.027996 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:11.027841 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bclp4" podUID="2f8d3b70-fe21-4feb-a984-12133895766b" Apr 23 16:35:11.028357 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:11.028335 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:11.028498 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:11.028441 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ng5vn" podUID="867d699c-6a2c-41bb-b885-26b9cd952983" Apr 23 16:35:11.074160 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:11.073444 2561 generic.go:358] "Generic (PLEG): container finished" podID="17dbb9da6ea0f39e403c5d7aceab008d" containerID="4a9447bf3cdf4e83df847dbee6e5c4fa1d4406b32d53f4f1a97af98aa4ae4cb0" exitCode=0 Apr 23 16:35:11.074160 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:11.073494 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" event={"ID":"17dbb9da6ea0f39e403c5d7aceab008d","Type":"ContainerDied","Data":"4a9447bf3cdf4e83df847dbee6e5c4fa1d4406b32d53f4f1a97af98aa4ae4cb0"} Apr 23 16:35:11.650989 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:11.650953 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dpwd\" (UniqueName: \"kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd\") pod \"network-check-target-bclp4\" (UID: \"2f8d3b70-fe21-4feb-a984-12133895766b\") " pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:11.651178 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:11.651006 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs\") pod \"network-metrics-daemon-rg2cr\" (UID: \"5c564efe-4a26-4498-9f97-d71703d0aa18\") " pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:11.651248 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:11.651189 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:11.651305 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:11.651249 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs podName:5c564efe-4a26-4498-9f97-d71703d0aa18 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:15.651230268 +0000 UTC m=+10.204664430 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs") pod "network-metrics-daemon-rg2cr" (UID: "5c564efe-4a26-4498-9f97-d71703d0aa18") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:11.651751 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:11.651724 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:11.651751 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:11.651748 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:11.651925 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:11.651761 2561 projected.go:194] Error preparing data for projected volume kube-api-access-7dpwd for pod openshift-network-diagnostics/network-check-target-bclp4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:11.651925 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:11.651804 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd podName:2f8d3b70-fe21-4feb-a984-12133895766b nodeName:}" failed. No retries permitted until 2026-04-23 16:35:15.651789815 +0000 UTC m=+10.205223978 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7dpwd" (UniqueName: "kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd") pod "network-check-target-bclp4" (UID: "2f8d3b70-fe21-4feb-a984-12133895766b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:12.028239 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:12.028205 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:12.028406 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:12.028351 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2cr" podUID="5c564efe-4a26-4498-9f97-d71703d0aa18" Apr 23 16:35:12.559895 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:12.559341 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret\") pod \"global-pull-secret-syncer-ng5vn\" (UID: \"867d699c-6a2c-41bb-b885-26b9cd952983\") " pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:12.559895 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:12.559495 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:12.559895 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:12.559554 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret podName:867d699c-6a2c-41bb-b885-26b9cd952983 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:16.559537261 +0000 UTC m=+11.112971412 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret") pod "global-pull-secret-syncer-ng5vn" (UID: "867d699c-6a2c-41bb-b885-26b9cd952983") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:13.027739 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:13.027658 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:13.027926 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:13.027785 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bclp4" podUID="2f8d3b70-fe21-4feb-a984-12133895766b" Apr 23 16:35:13.028204 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:13.028185 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:13.028295 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:13.028275 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ng5vn" podUID="867d699c-6a2c-41bb-b885-26b9cd952983" Apr 23 16:35:14.027804 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:14.027760 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:14.028276 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:14.027895 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2cr" podUID="5c564efe-4a26-4498-9f97-d71703d0aa18" Apr 23 16:35:15.028007 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:15.027981 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:15.028448 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:15.028008 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:15.028448 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:15.028110 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ng5vn" podUID="867d699c-6a2c-41bb-b885-26b9cd952983" Apr 23 16:35:15.028448 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:15.028210 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bclp4" podUID="2f8d3b70-fe21-4feb-a984-12133895766b" Apr 23 16:35:15.683891 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:15.683457 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dpwd\" (UniqueName: \"kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd\") pod \"network-check-target-bclp4\" (UID: \"2f8d3b70-fe21-4feb-a984-12133895766b\") " pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:15.683891 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:15.683510 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs\") pod \"network-metrics-daemon-rg2cr\" (UID: \"5c564efe-4a26-4498-9f97-d71703d0aa18\") " pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:15.683891 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:15.683644 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:15.683891 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:15.683675 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:15.683891 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:15.683703 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:15.683891 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:15.683711 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs podName:5c564efe-4a26-4498-9f97-d71703d0aa18 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:23.683690877 +0000 UTC m=+18.237125026 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs") pod "network-metrics-daemon-rg2cr" (UID: "5c564efe-4a26-4498-9f97-d71703d0aa18") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:15.683891 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:15.683719 2561 projected.go:194] Error preparing data for projected volume kube-api-access-7dpwd for pod openshift-network-diagnostics/network-check-target-bclp4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:15.683891 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:15.683774 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd podName:2f8d3b70-fe21-4feb-a984-12133895766b nodeName:}" failed. No retries permitted until 2026-04-23 16:35:23.683757409 +0000 UTC m=+18.237191557 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7dpwd" (UniqueName: "kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd") pod "network-check-target-bclp4" (UID: "2f8d3b70-fe21-4feb-a984-12133895766b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:16.029670 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:16.029193 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:16.029670 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:16.029319 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2cr" podUID="5c564efe-4a26-4498-9f97-d71703d0aa18" Apr 23 16:35:16.591636 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:16.591596 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret\") pod \"global-pull-secret-syncer-ng5vn\" (UID: \"867d699c-6a2c-41bb-b885-26b9cd952983\") " pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:16.591807 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:16.591749 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:16.591890 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:16.591813 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret podName:867d699c-6a2c-41bb-b885-26b9cd952983 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:24.591795019 +0000 UTC m=+19.145229166 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret") pod "global-pull-secret-syncer-ng5vn" (UID: "867d699c-6a2c-41bb-b885-26b9cd952983") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:17.027750 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:17.027615 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:17.027939 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:17.027746 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ng5vn" podUID="867d699c-6a2c-41bb-b885-26b9cd952983" Apr 23 16:35:17.027939 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:17.027807 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:17.028051 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:17.027934 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bclp4" podUID="2f8d3b70-fe21-4feb-a984-12133895766b" Apr 23 16:35:18.028124 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:18.028097 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:18.028523 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:18.028235 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2cr" podUID="5c564efe-4a26-4498-9f97-d71703d0aa18" Apr 23 16:35:19.027889 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:19.027843 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:19.028047 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:19.027974 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ng5vn" podUID="867d699c-6a2c-41bb-b885-26b9cd952983" Apr 23 16:35:19.028047 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:19.028007 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:19.028156 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:19.028115 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bclp4" podUID="2f8d3b70-fe21-4feb-a984-12133895766b" Apr 23 16:35:20.028340 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:20.028302 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:20.028781 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:20.028444 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2cr" podUID="5c564efe-4a26-4498-9f97-d71703d0aa18" Apr 23 16:35:21.027253 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:21.027218 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:21.027409 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:21.027307 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ng5vn" podUID="867d699c-6a2c-41bb-b885-26b9cd952983" Apr 23 16:35:21.027409 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:21.027218 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:21.027409 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:21.027364 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bclp4" podUID="2f8d3b70-fe21-4feb-a984-12133895766b" Apr 23 16:35:22.028169 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:22.028140 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:22.028628 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:22.028275 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2cr" podUID="5c564efe-4a26-4498-9f97-d71703d0aa18" Apr 23 16:35:23.028003 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:23.027972 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:23.028263 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:23.027975 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:23.028263 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:23.028095 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ng5vn" podUID="867d699c-6a2c-41bb-b885-26b9cd952983" Apr 23 16:35:23.028263 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:23.028143 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bclp4" podUID="2f8d3b70-fe21-4feb-a984-12133895766b" Apr 23 16:35:23.746181 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:23.746150 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dpwd\" (UniqueName: \"kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd\") pod \"network-check-target-bclp4\" (UID: \"2f8d3b70-fe21-4feb-a984-12133895766b\") " pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:23.746353 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:23.746195 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs\") pod \"network-metrics-daemon-rg2cr\" (UID: \"5c564efe-4a26-4498-9f97-d71703d0aa18\") " pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:23.746353 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:23.746306 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:23.746353 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:23.746334 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:23.746482 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:23.746360 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs podName:5c564efe-4a26-4498-9f97-d71703d0aa18 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:39.746343288 +0000 UTC m=+34.299777437 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs") pod "network-metrics-daemon-rg2cr" (UID: "5c564efe-4a26-4498-9f97-d71703d0aa18") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:23.746482 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:23.746359 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:23.746482 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:23.746375 2561 projected.go:194] Error preparing data for projected volume kube-api-access-7dpwd for pod openshift-network-diagnostics/network-check-target-bclp4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:23.746482 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:23.746399 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd podName:2f8d3b70-fe21-4feb-a984-12133895766b nodeName:}" failed. No retries permitted until 2026-04-23 16:35:39.746390853 +0000 UTC m=+34.299825000 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7dpwd" (UniqueName: "kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd") pod "network-check-target-bclp4" (UID: "2f8d3b70-fe21-4feb-a984-12133895766b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:24.028285 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:24.028252 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:24.028743 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:24.028404 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2cr" podUID="5c564efe-4a26-4498-9f97-d71703d0aa18" Apr 23 16:35:24.653469 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:24.653436 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret\") pod \"global-pull-secret-syncer-ng5vn\" (UID: \"867d699c-6a2c-41bb-b885-26b9cd952983\") " pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:24.653659 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:24.653562 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:24.653659 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:24.653630 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret podName:867d699c-6a2c-41bb-b885-26b9cd952983 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:40.65361266 +0000 UTC m=+35.207046823 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret") pod "global-pull-secret-syncer-ng5vn" (UID: "867d699c-6a2c-41bb-b885-26b9cd952983") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:25.027562 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:25.027493 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:25.027708 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:25.027597 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ng5vn" podUID="867d699c-6a2c-41bb-b885-26b9cd952983" Apr 23 16:35:25.027708 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:25.027625 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:25.027810 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:25.027705 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bclp4" podUID="2f8d3b70-fe21-4feb-a984-12133895766b" Apr 23 16:35:26.028624 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:26.027969 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:26.029279 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:26.029246 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2cr" podUID="5c564efe-4a26-4498-9f97-d71703d0aa18" Apr 23 16:35:26.096338 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:26.096167 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" event={"ID":"17dbb9da6ea0f39e403c5d7aceab008d","Type":"ContainerStarted","Data":"0c3244a2f3b8caf2f6c97295201dd7ea50bf6e8384febbce047013d4de11948a"} Apr 23 16:35:26.101751 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:26.101617 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-24kss" event={"ID":"6b541e1a-ac49-4260-93b3-d5e6e7e04eb5","Type":"ContainerStarted","Data":"1e11fab4739ffc5a09c12a42583d0b227de9abbf63958fecfca7a49f07ea73ea"} Apr 23 16:35:26.103435 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:26.103399 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7wntm" event={"ID":"9fd1b1de-6101-4e7d-a523-325e848a740a","Type":"ContainerStarted","Data":"0e4367472bc3cfb24faf56ccb81b9e58698e566628a3ccd6b7a0f448a893f4d8"} Apr 23 16:35:26.105363 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:26.104967 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" event={"ID":"543e930d-5825-406c-b2da-236f7eef2b83","Type":"ContainerStarted","Data":"bffca86f438bf4cbaccbe766432eae4694adf44f7f5ad4a4fb866c42c5e75811"} Apr 23 16:35:26.107892 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:26.107408 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" event={"ID":"1f8739d4-2dae-4fa1-b0be-80c6e24dce30","Type":"ContainerStarted","Data":"d13a56d8b44ad18fc4d4fe0682224ee3ed5a03e2244d690bfdcc3de50f39cc4e"} Apr 23 16:35:26.107892 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:26.107432 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" event={"ID":"1f8739d4-2dae-4fa1-b0be-80c6e24dce30","Type":"ContainerStarted","Data":"251cd4aea2c31e6d4cbabcb7cf600b34b50d6d389243cc4d6d6d493bf0c0f4a6"} Apr 23 16:35:26.109157 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:26.109132 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-476qf" event={"ID":"842de44e-cb5c-472a-9cf5-7d48346188d8","Type":"ContainerStarted","Data":"3942f33fdeef4a4618eacc06c349c59867a796bc0c1b75387243f9e74d043fa8"} Apr 23 16:35:26.112341 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:26.112155 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6tcrq" event={"ID":"3b9f6eca-1a5a-42f3-acf5-01a9d352a780","Type":"ContainerStarted","Data":"bb8658c17f8d13a11636bb0d64d694f4abe6914e7cfe808b40b0944f25b8fa34"} Apr 23 16:35:26.124668 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:26.124621 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7wntm" podStartSLOduration=3.199665572 podStartE2EDuration="20.124605929s" podCreationTimestamp="2026-04-23 16:35:06 +0000 UTC" firstStartedPulling="2026-04-23 16:35:08.795090949 +0000 UTC m=+3.348525103" lastFinishedPulling="2026-04-23 16:35:25.720031298 +0000 UTC m=+20.273465460" observedRunningTime="2026-04-23 16:35:26.123850522 +0000 UTC m=+20.677284690" watchObservedRunningTime="2026-04-23 16:35:26.124605929 +0000 UTC m=+20.678040099" Apr 23 16:35:26.124762 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:26.124716 
2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-189.ec2.internal" podStartSLOduration=19.124710694 podStartE2EDuration="19.124710694s" podCreationTimestamp="2026-04-23 16:35:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:26.110335216 +0000 UTC m=+20.663769386" watchObservedRunningTime="2026-04-23 16:35:26.124710694 +0000 UTC m=+20.678144865" Apr 23 16:35:26.149792 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:26.149756 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-24kss" podStartSLOduration=2.990058887 podStartE2EDuration="20.149743697s" podCreationTimestamp="2026-04-23 16:35:06 +0000 UTC" firstStartedPulling="2026-04-23 16:35:08.783746081 +0000 UTC m=+3.337180227" lastFinishedPulling="2026-04-23 16:35:25.943430876 +0000 UTC m=+20.496865037" observedRunningTime="2026-04-23 16:35:26.149270572 +0000 UTC m=+20.702704741" watchObservedRunningTime="2026-04-23 16:35:26.149743697 +0000 UTC m=+20.703177866" Apr 23 16:35:26.176142 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:26.176097 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-gmwdt" podStartSLOduration=3.226940818 podStartE2EDuration="20.176085039s" podCreationTimestamp="2026-04-23 16:35:06 +0000 UTC" firstStartedPulling="2026-04-23 16:35:08.794177652 +0000 UTC m=+3.347611800" lastFinishedPulling="2026-04-23 16:35:25.743321857 +0000 UTC m=+20.296756021" observedRunningTime="2026-04-23 16:35:26.175849934 +0000 UTC m=+20.729284104" watchObservedRunningTime="2026-04-23 16:35:26.176085039 +0000 UTC m=+20.729519207" Apr 23 16:35:26.237732 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:26.237688 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-476qf" podStartSLOduration=3.290108996 podStartE2EDuration="20.237672811s" podCreationTimestamp="2026-04-23 16:35:06 +0000 UTC" firstStartedPulling="2026-04-23 16:35:08.793105408 +0000 UTC m=+3.346539558" lastFinishedPulling="2026-04-23 16:35:25.740669213 +0000 UTC m=+20.294103373" observedRunningTime="2026-04-23 16:35:26.195596522 +0000 UTC m=+20.749030692" watchObservedRunningTime="2026-04-23 16:35:26.237672811 +0000 UTC m=+20.791106979" Apr 23 16:35:26.970416 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:26.970390 2561 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 16:35:27.027889 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:27.027854 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:27.027987 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:27.027862 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:27.027987 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:27.027951 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bclp4" podUID="2f8d3b70-fe21-4feb-a984-12133895766b" Apr 23 16:35:27.028109 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:27.028017 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ng5vn" podUID="867d699c-6a2c-41bb-b885-26b9cd952983" Apr 23 16:35:27.115587 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:27.115557 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" event={"ID":"1f8739d4-2dae-4fa1-b0be-80c6e24dce30","Type":"ContainerStarted","Data":"8127ebdbf072f46c0f3f7beeba7422ba39b3343c9d290f526e8092d3c32f1849"} Apr 23 16:35:27.116218 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:27.115591 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" event={"ID":"1f8739d4-2dae-4fa1-b0be-80c6e24dce30","Type":"ContainerStarted","Data":"4faf072d959eace77702c365f51780d3826ffe8f9c043d841970223765136556"} Apr 23 16:35:27.116218 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:27.115616 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" event={"ID":"1f8739d4-2dae-4fa1-b0be-80c6e24dce30","Type":"ContainerStarted","Data":"e185a0bbe6792ea9d7ee8a57b221dd8507951f94321acbdbc033263588d46ea0"} Apr 23 16:35:27.116218 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:27.115628 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" event={"ID":"1f8739d4-2dae-4fa1-b0be-80c6e24dce30","Type":"ContainerStarted","Data":"d145d7f2b4395df1c3b1800f1037c2b53b38416059ff211a81d82e46ee468bf6"} Apr 23 16:35:27.116832 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:27.116808 2561 generic.go:358] "Generic (PLEG): container finished" podID="3b9f6eca-1a5a-42f3-acf5-01a9d352a780" containerID="bb8658c17f8d13a11636bb0d64d694f4abe6914e7cfe808b40b0944f25b8fa34" exitCode=0 Apr 23 16:35:27.116919 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:27.116889 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6tcrq" event={"ID":"3b9f6eca-1a5a-42f3-acf5-01a9d352a780","Type":"ContainerDied","Data":"bb8658c17f8d13a11636bb0d64d694f4abe6914e7cfe808b40b0944f25b8fa34"} Apr 23 16:35:27.118213 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:27.118114 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hvjxm" event={"ID":"0b73ac93-0681-4ba9-8f35-d63197135397","Type":"ContainerStarted","Data":"b6cc7b56a239a6473db52547bc72b184886195cdc61c1672e41e78be73d00130"} Apr 23 16:35:27.119707 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:27.119689 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh" event={"ID":"c9b78ad8-b01b-45a1-b022-80a305241f06","Type":"ContainerStarted","Data":"977c5e12c6e31214c4d88b2e326f8bd5531026abf6e29caa1ab357db35a223b3"} Apr 23 16:35:27.119792 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:27.119709 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh" 
event={"ID":"c9b78ad8-b01b-45a1-b022-80a305241f06","Type":"ContainerStarted","Data":"381892db92fa7bbd373df6075aa7d252a780325424d58b27fb054d5127ddac25"} Apr 23 16:35:27.121581 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:27.121536 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8c5j2" event={"ID":"51bf838a-6ae8-4ed3-bdd7-7040c25cac11","Type":"ContainerStarted","Data":"43f9dacc48314e8a7effeeb1d0cbb7b2d0d36eab1975d9e4650550d0625a2ac5"} Apr 23 16:35:27.155162 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:27.155121 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-8c5j2" podStartSLOduration=8.903074297 podStartE2EDuration="21.155110249s" podCreationTimestamp="2026-04-23 16:35:06 +0000 UTC" firstStartedPulling="2026-04-23 16:35:08.787194375 +0000 UTC m=+3.340628521" lastFinishedPulling="2026-04-23 16:35:21.039230325 +0000 UTC m=+15.592664473" observedRunningTime="2026-04-23 16:35:27.154788538 +0000 UTC m=+21.708222707" watchObservedRunningTime="2026-04-23 16:35:27.155110249 +0000 UTC m=+21.708544461" Apr 23 16:35:27.173695 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:27.173657 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-hvjxm" podStartSLOduration=4.239181869 podStartE2EDuration="21.173647707s" podCreationTimestamp="2026-04-23 16:35:06 +0000 UTC" firstStartedPulling="2026-04-23 16:35:08.785562548 +0000 UTC m=+3.338996694" lastFinishedPulling="2026-04-23 16:35:25.720028374 +0000 UTC m=+20.273462532" observedRunningTime="2026-04-23 16:35:27.173332025 +0000 UTC m=+21.726766193" watchObservedRunningTime="2026-04-23 16:35:27.173647707 +0000 UTC m=+21.727081875" Apr 23 16:35:27.529020 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:27.528981 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-8c5j2" Apr 23 16:35:27.530113 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:27.530093 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-8c5j2" Apr 23 16:35:27.966948 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:27.966788 2561 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T16:35:26.970412713Z","UUID":"368ee74a-2852-4683-8a44-424bae6d1594","Handler":null,"Name":"","Endpoint":""} Apr 23 16:35:27.968476 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:27.968457 2561 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 16:35:27.968576 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:27.968486 2561 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 16:35:28.028289 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:28.028259 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:28.028399 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:28.028381 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2cr" podUID="5c564efe-4a26-4498-9f97-d71703d0aa18" Apr 23 16:35:28.125001 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:28.124924 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh" event={"ID":"c9b78ad8-b01b-45a1-b022-80a305241f06","Type":"ContainerStarted","Data":"842f2cbe3e5ad845f6ccae5ae64c2bda71b4b5311142625c5559332e4346f387"} Apr 23 16:35:28.158140 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:28.158101 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pzwh" podStartSLOduration=3.184735691 podStartE2EDuration="22.158087633s" podCreationTimestamp="2026-04-23 16:35:06 +0000 UTC" firstStartedPulling="2026-04-23 16:35:08.789690074 +0000 UTC m=+3.343124228" lastFinishedPulling="2026-04-23 16:35:27.763042014 +0000 UTC m=+22.316476170" observedRunningTime="2026-04-23 16:35:28.156596671 +0000 UTC m=+22.710030839" watchObservedRunningTime="2026-04-23 16:35:28.158087633 +0000 UTC m=+22.711521803" Apr 23 16:35:29.028075 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:29.027905 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:29.028252 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:29.027924 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:29.028252 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:29.028161 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bclp4" podUID="2f8d3b70-fe21-4feb-a984-12133895766b" Apr 23 16:35:29.028252 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:29.028242 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ng5vn" podUID="867d699c-6a2c-41bb-b885-26b9cd952983" Apr 23 16:35:29.129346 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:29.129301 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" event={"ID":"1f8739d4-2dae-4fa1-b0be-80c6e24dce30","Type":"ContainerStarted","Data":"0565af9781e322c34630016b9a1bc4d3a3a63d71aeaa32e7985eeed553ec06be"} Apr 23 16:35:29.129753 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:29.129405 2561 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 23 16:35:30.027892 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:30.027844 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:30.028057 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:30.028008 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2cr" podUID="5c564efe-4a26-4498-9f97-d71703d0aa18" Apr 23 16:35:31.028308 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:31.027902 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:31.028308 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:31.028028 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ng5vn" podUID="867d699c-6a2c-41bb-b885-26b9cd952983" Apr 23 16:35:31.030473 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:31.028395 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:31.030473 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:31.028496 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bclp4" podUID="2f8d3b70-fe21-4feb-a984-12133895766b" Apr 23 16:35:31.137263 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:31.137056 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" event={"ID":"1f8739d4-2dae-4fa1-b0be-80c6e24dce30","Type":"ContainerStarted","Data":"923567f10e26230156df6fe0811d67ad5607471e156e08f623e0009c2d9cd0d0"} Apr 23 16:35:31.137409 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:31.137340 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:31.137409 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:31.137364 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:31.139067 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:31.139032 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6tcrq" event={"ID":"3b9f6eca-1a5a-42f3-acf5-01a9d352a780","Type":"ContainerStarted","Data":"91e82346e95ad03bb019f1f6372a3bbc9de93368ed8665af630f631483a7f234"} Apr 23 16:35:31.151956 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:31.151934 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:31.152050 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:31.152001 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:31.170492 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:31.170455 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" podStartSLOduration=8.171433228 podStartE2EDuration="25.170443691s" podCreationTimestamp="2026-04-23 16:35:06 +0000 UTC" firstStartedPulling="2026-04-23 16:35:08.792728554 +0000 UTC m=+3.346162701" lastFinishedPulling="2026-04-23 16:35:25.791738998 +0000 UTC m=+20.345173164" observedRunningTime="2026-04-23 16:35:31.169796542 +0000 UTC m=+25.723230738" watchObservedRunningTime="2026-04-23 16:35:31.170443691 +0000 UTC m=+25.723877860" Apr 23 16:35:32.030534 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:32.030503 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:32.030869 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:32.030596 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2cr" podUID="5c564efe-4a26-4498-9f97-d71703d0aa18" Apr 23 16:35:32.143058 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:32.143022 2561 generic.go:358] "Generic (PLEG): container finished" podID="3b9f6eca-1a5a-42f3-acf5-01a9d352a780" containerID="91e82346e95ad03bb019f1f6372a3bbc9de93368ed8665af630f631483a7f234" exitCode=0 Apr 23 16:35:32.143285 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:32.143268 2561 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 23 16:35:32.144802 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:32.144769 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6tcrq" event={"ID":"3b9f6eca-1a5a-42f3-acf5-01a9d352a780","Type":"ContainerDied","Data":"91e82346e95ad03bb019f1f6372a3bbc9de93368ed8665af630f631483a7f234"} Apr 23 16:35:32.974341 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:32.974316 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bclp4"] Apr 23 16:35:32.974458 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:32.974445 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:32.974565 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:32.974546 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bclp4" podUID="2f8d3b70-fe21-4feb-a984-12133895766b" Apr 23 16:35:32.977402 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:32.977380 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rg2cr"] Apr 23 16:35:32.977502 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:32.977471 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:32.977598 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:32.977576 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2cr" podUID="5c564efe-4a26-4498-9f97-d71703d0aa18" Apr 23 16:35:32.980506 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:32.980484 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ng5vn"] Apr 23 16:35:32.980590 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:32.980576 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:32.980698 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:32.980678 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ng5vn" podUID="867d699c-6a2c-41bb-b885-26b9cd952983" Apr 23 16:35:33.147099 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:33.147009 2561 generic.go:358] "Generic (PLEG): container finished" podID="3b9f6eca-1a5a-42f3-acf5-01a9d352a780" containerID="3958116bbd522fb7f4981cc4234fe5d58b8298b2d1a63c7b35622c365d8e8599" exitCode=0 Apr 23 16:35:33.147435 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:33.147090 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6tcrq" event={"ID":"3b9f6eca-1a5a-42f3-acf5-01a9d352a780","Type":"ContainerDied","Data":"3958116bbd522fb7f4981cc4234fe5d58b8298b2d1a63c7b35622c365d8e8599"} Apr 23 16:35:33.147435 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:33.147343 2561 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 23 16:35:34.150280 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:34.150249 2561 generic.go:358] "Generic (PLEG): container finished" podID="3b9f6eca-1a5a-42f3-acf5-01a9d352a780" containerID="dbfad1c05b64feebb7d0dbecd4dbae8126aa65ba00fdb6c55feb6b50c9bfce8c" exitCode=0 Apr 23 16:35:34.150633 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:34.150298 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6tcrq" event={"ID":"3b9f6eca-1a5a-42f3-acf5-01a9d352a780","Type":"ContainerDied","Data":"dbfad1c05b64feebb7d0dbecd4dbae8126aa65ba00fdb6c55feb6b50c9bfce8c"} Apr 23 16:35:34.645780 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:34.645751 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs" Apr 23 16:35:35.027992 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:35.027830 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:35.028104 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:35.027870 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:35.028104 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:35.028083 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bclp4" podUID="2f8d3b70-fe21-4feb-a984-12133895766b" Apr 23 16:35:35.028212 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:35.028146 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ng5vn" podUID="867d699c-6a2c-41bb-b885-26b9cd952983" Apr 23 16:35:35.028212 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:35.027926 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:35.028303 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:35.028227 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2cr" podUID="5c564efe-4a26-4498-9f97-d71703d0aa18" Apr 23 16:35:36.677506 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:36.677472 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-8c5j2" Apr 23 16:35:36.677981 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:36.677618 2561 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 23 16:35:36.678252 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:36.678228 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-8c5j2" Apr 23 16:35:37.027905 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:37.027857 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:37.027905 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:37.027898 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:37.028111 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:37.027857 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:37.028111 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:37.028005 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bclp4" podUID="2f8d3b70-fe21-4feb-a984-12133895766b" Apr 23 16:35:37.028111 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:37.028060 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ng5vn" podUID="867d699c-6a2c-41bb-b885-26b9cd952983" Apr 23 16:35:37.028213 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:37.028135 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2cr" podUID="5c564efe-4a26-4498-9f97-d71703d0aa18" Apr 23 16:35:38.760950 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.760860 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-189.ec2.internal" event="NodeReady" Apr 23 16:35:38.761418 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.761026 2561 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 16:35:38.830765 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.829463 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6f59db74c9-f44nw"] Apr 23 16:35:38.854472 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.854440 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-49hdt"] Apr 23 16:35:38.854650 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.854628 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:38.864871 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.864490 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 16:35:38.864871 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.864513 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 16:35:38.864871 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.864547 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 16:35:38.865903 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.865870 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-f8556\"" Apr 23 16:35:38.866532 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.866510 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cdjpw"] Apr 23 16:35:38.866790 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.866765 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-49hdt" Apr 23 16:35:38.871439 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.871419 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 16:35:38.872921 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.872901 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 16:35:38.873020 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.872972 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 16:35:38.874601 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.874573 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-s982g\"" Apr 23 16:35:38.874682 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.874656 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 16:35:38.881449 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.881430 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6f59db74c9-f44nw"] Apr 23 16:35:38.881548 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.881460 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-49hdt"] Apr 23 16:35:38.881600 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.881569 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cdjpw" Apr 23 16:35:38.885572 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.885533 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 16:35:38.885793 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.885573 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jmk6t\"" Apr 23 16:35:38.885793 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.885574 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 16:35:38.901772 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.901752 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cdjpw"] Apr 23 16:35:38.961722 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.961689 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-installation-pull-secrets\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:38.961919 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.961754 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjwlf\" (UniqueName: \"kubernetes.io/projected/c2a76a12-982a-438c-837c-0e7665a6f46c-kube-api-access-cjwlf\") pod \"ingress-canary-49hdt\" (UID: \"c2a76a12-982a-438c-837c-0e7665a6f46c\") " pod="openshift-ingress-canary/ingress-canary-49hdt" Apr 23 16:35:38.961919 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.961819 2561 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n5kn\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-kube-api-access-6n5kn\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:38.961919 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.961852 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f0e9f3d-ecd0-4e57-8ef1-447361404429-config-volume\") pod \"dns-default-cdjpw\" (UID: \"0f0e9f3d-ecd0-4e57-8ef1-447361404429\") " pod="openshift-dns/dns-default-cdjpw" Apr 23 16:35:38.961919 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.961902 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-trusted-ca\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:38.962145 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.961925 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0f0e9f3d-ecd0-4e57-8ef1-447361404429-tmp-dir\") pod \"dns-default-cdjpw\" (UID: \"0f0e9f3d-ecd0-4e57-8ef1-447361404429\") " pod="openshift-dns/dns-default-cdjpw" Apr 23 16:35:38.962145 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.961984 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-ca-trust-extracted\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:38.962145 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.962010 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert\") pod \"ingress-canary-49hdt\" (UID: \"c2a76a12-982a-438c-837c-0e7665a6f46c\") " pod="openshift-ingress-canary/ingress-canary-49hdt" Apr 23 16:35:38.962145 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.962033 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tggxd\" (UniqueName: \"kubernetes.io/projected/0f0e9f3d-ecd0-4e57-8ef1-447361404429-kube-api-access-tggxd\") pod \"dns-default-cdjpw\" (UID: \"0f0e9f3d-ecd0-4e57-8ef1-447361404429\") " pod="openshift-dns/dns-default-cdjpw" Apr 23 16:35:38.962145 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.962073 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-image-registry-private-configuration\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:38.962145 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.962105 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-bound-sa-token\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:38.962395 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.962162 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:38.962395 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.962192 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls\") pod \"dns-default-cdjpw\" (UID: \"0f0e9f3d-ecd0-4e57-8ef1-447361404429\") " pod="openshift-dns/dns-default-cdjpw" Apr 23 16:35:38.962395 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:38.962221 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-certificates\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:39.027933 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.027906 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:39.028147 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.027941 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:39.028147 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.028100 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:39.033563 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.033539 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 16:35:39.034068 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.033821 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 16:35:39.034068 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.033838 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f89h4\"" Apr 23 16:35:39.034068 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.033973 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 16:35:39.034068 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.033991 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-6xqf4\"" Apr 23 16:35:39.034068 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.033983 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 16:35:39.061006 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.060986 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4"] Apr 23 16:35:39.063265 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.063244 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-image-registry-private-configuration\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:39.063381 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.063273 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-bound-sa-token\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:39.063471 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.063451 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:39.063534 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.063498 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls\") pod \"dns-default-cdjpw\" (UID: \"0f0e9f3d-ecd0-4e57-8ef1-447361404429\") " pod="openshift-dns/dns-default-cdjpw" Apr 23 16:35:39.063534 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.063526 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-certificates\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:39.063635 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.063547 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-installation-pull-secrets\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:39.063635 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:39.063570 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:35:39.063635 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:39.063586 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f59db74c9-f44nw: secret "image-registry-tls" not found Apr 23 16:35:39.063635 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.063589 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjwlf\" (UniqueName: \"kubernetes.io/projected/c2a76a12-982a-438c-837c-0e7665a6f46c-kube-api-access-cjwlf\") pod \"ingress-canary-49hdt\" (UID: \"c2a76a12-982a-438c-837c-0e7665a6f46c\") " pod="openshift-ingress-canary/ingress-canary-49hdt" Apr 23 16:35:39.063635 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.063613 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6n5kn\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-kube-api-access-6n5kn\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:39.063894 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:39.063639 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:39.063894 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:39.063644 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls podName:9effa94c-05cd-4a19-9dac-bcfc8c8f18f4 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:39.563626112 +0000 UTC m=+34.117060263 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls") pod "image-registry-6f59db74c9-f44nw" (UID: "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4") : secret "image-registry-tls" not found Apr 23 16:35:39.063894 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.063669 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f0e9f3d-ecd0-4e57-8ef1-447361404429-config-volume\") pod \"dns-default-cdjpw\" (UID: \"0f0e9f3d-ecd0-4e57-8ef1-447361404429\") " pod="openshift-dns/dns-default-cdjpw" Apr 23 16:35:39.063894 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:39.063683 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls podName:0f0e9f3d-ecd0-4e57-8ef1-447361404429 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:39.563670359 +0000 UTC m=+34.117104507 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls") pod "dns-default-cdjpw" (UID: "0f0e9f3d-ecd0-4e57-8ef1-447361404429") : secret "dns-default-metrics-tls" not found Apr 23 16:35:39.063894 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.063712 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-trusted-ca\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:39.063894 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.063740 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0f0e9f3d-ecd0-4e57-8ef1-447361404429-tmp-dir\") pod \"dns-default-cdjpw\" (UID: \"0f0e9f3d-ecd0-4e57-8ef1-447361404429\") " pod="openshift-dns/dns-default-cdjpw" Apr 23 16:35:39.063894 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.063799 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-ca-trust-extracted\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:39.063894 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.063824 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert\") pod \"ingress-canary-49hdt\" (UID: \"c2a76a12-982a-438c-837c-0e7665a6f46c\") " pod="openshift-ingress-canary/ingress-canary-49hdt" Apr 23 16:35:39.063894 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.063849 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tggxd\" (UniqueName: \"kubernetes.io/projected/0f0e9f3d-ecd0-4e57-8ef1-447361404429-kube-api-access-tggxd\") pod \"dns-default-cdjpw\" (UID: \"0f0e9f3d-ecd0-4e57-8ef1-447361404429\") " pod="openshift-dns/dns-default-cdjpw" Apr 23 16:35:39.064332 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.064116 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/0f0e9f3d-ecd0-4e57-8ef1-447361404429-tmp-dir\") pod \"dns-default-cdjpw\" (UID: \"0f0e9f3d-ecd0-4e57-8ef1-447361404429\") " pod="openshift-dns/dns-default-cdjpw" Apr 23 16:35:39.064332 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:39.064171 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:39.064332 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:39.064217 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert podName:c2a76a12-982a-438c-837c-0e7665a6f46c nodeName:}" failed. No retries permitted until 2026-04-23 16:35:39.564201157 +0000 UTC m=+34.117635306 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert") pod "ingress-canary-49hdt" (UID: "c2a76a12-982a-438c-837c-0e7665a6f46c") : secret "canary-serving-cert" not found Apr 23 16:35:39.065388 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.064228 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f0e9f3d-ecd0-4e57-8ef1-447361404429-config-volume\") pod \"dns-default-cdjpw\" (UID: \"0f0e9f3d-ecd0-4e57-8ef1-447361404429\") " pod="openshift-dns/dns-default-cdjpw" Apr 23 16:35:39.065388 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.065374 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-certificates\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:39.066059 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.066036 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-ca-trust-extracted\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:39.066290 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.066266 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-trusted-ca\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:39.068987 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.067951 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-installation-pull-secrets\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:39.068987 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.067965 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-image-registry-private-configuration\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " 
pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:39.091519 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.091498 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" Apr 23 16:35:39.092304 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.092281 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4"] Apr 23 16:35:39.094603 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.094567 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-bound-sa-token\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:39.097590 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.097496 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 23 16:35:39.097684 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.097598 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 23 16:35:39.097684 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.097636 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 23 16:35:39.097909 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.097841 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 23 16:35:39.112958 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.112931 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh"] Apr 23 16:35:39.114483 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.114463 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tggxd\" (UniqueName: \"kubernetes.io/projected/0f0e9f3d-ecd0-4e57-8ef1-447361404429-kube-api-access-tggxd\") pod \"dns-default-cdjpw\" (UID: \"0f0e9f3d-ecd0-4e57-8ef1-447361404429\") " pod="openshift-dns/dns-default-cdjpw" Apr 23 16:35:39.114574 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.114533 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n5kn\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-kube-api-access-6n5kn\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:39.114680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.114662 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjwlf\" (UniqueName: \"kubernetes.io/projected/c2a76a12-982a-438c-837c-0e7665a6f46c-kube-api-access-cjwlf\") pod \"ingress-canary-49hdt\" (UID: \"c2a76a12-982a-438c-837c-0e7665a6f46c\") " pod="openshift-ingress-canary/ingress-canary-49hdt" Apr 23 16:35:39.132891 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.132855 2561 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-785997c65f-bzbg8"] Apr 23 16:35:39.133050 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.133011 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 16:35:39.136687 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.136666 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 23 16:35:39.136788 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.136690 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 23 16:35:39.136999 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.136979 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 23 16:35:39.136999 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.136998 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 23 16:35:39.151285 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.151247 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh"] Apr 23 16:35:39.151392 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.151292 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-785997c65f-bzbg8"] Apr 23 16:35:39.151392 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.151295 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-785997c65f-bzbg8" Apr 23 16:35:39.154992 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.154967 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-776vt\"" Apr 23 16:35:39.155077 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.155012 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 23 16:35:39.265924 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.265869 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/04f18820-a137-4fa2-804e-b8a6a7bf9eb4-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7749dbf8b5-zr4nh\" (UID: \"04f18820-a137-4fa2-804e-b8a6a7bf9eb4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 16:35:39.266116 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.265934 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9hz9\" (UniqueName: \"kubernetes.io/projected/04f18820-a137-4fa2-804e-b8a6a7bf9eb4-kube-api-access-k9hz9\") pod \"cluster-proxy-proxy-agent-7749dbf8b5-zr4nh\" (UID: \"04f18820-a137-4fa2-804e-b8a6a7bf9eb4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 16:35:39.266116 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.266021 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/304e7398-03f0-41a8-8a0f-43ae08b2760a-tmp\") pod \"klusterlet-addon-workmgr-544dfb5797-k44w4\" (UID: \"304e7398-03f0-41a8-8a0f-43ae08b2760a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" Apr 23 16:35:39.266116 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.266051 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24mmb\" (UniqueName: \"kubernetes.io/projected/304e7398-03f0-41a8-8a0f-43ae08b2760a-kube-api-access-24mmb\") pod \"klusterlet-addon-workmgr-544dfb5797-k44w4\" (UID: \"304e7398-03f0-41a8-8a0f-43ae08b2760a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" Apr 23 16:35:39.266116 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.266081 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n47hh\" (UniqueName: \"kubernetes.io/projected/704a1757-9f37-46f2-bf18-a099b1493b45-kube-api-access-n47hh\") pod \"managed-serviceaccount-addon-agent-785997c65f-bzbg8\" (UID: \"704a1757-9f37-46f2-bf18-a099b1493b45\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-785997c65f-bzbg8" Apr 23 16:35:39.266304 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.266177 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/04f18820-a137-4fa2-804e-b8a6a7bf9eb4-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7749dbf8b5-zr4nh\" (UID: \"04f18820-a137-4fa2-804e-b8a6a7bf9eb4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 
16:35:39.266304 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.266207 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/04f18820-a137-4fa2-804e-b8a6a7bf9eb4-ca\") pod \"cluster-proxy-proxy-agent-7749dbf8b5-zr4nh\" (UID: \"04f18820-a137-4fa2-804e-b8a6a7bf9eb4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 16:35:39.266304 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.266243 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/04f18820-a137-4fa2-804e-b8a6a7bf9eb4-hub\") pod \"cluster-proxy-proxy-agent-7749dbf8b5-zr4nh\" (UID: \"04f18820-a137-4fa2-804e-b8a6a7bf9eb4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 16:35:39.266304 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.266263 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/704a1757-9f37-46f2-bf18-a099b1493b45-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-785997c65f-bzbg8\" (UID: \"704a1757-9f37-46f2-bf18-a099b1493b45\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-785997c65f-bzbg8" Apr 23 16:35:39.266483 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.266308 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/04f18820-a137-4fa2-804e-b8a6a7bf9eb4-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7749dbf8b5-zr4nh\" (UID: \"04f18820-a137-4fa2-804e-b8a6a7bf9eb4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 16:35:39.266483 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.266328 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/304e7398-03f0-41a8-8a0f-43ae08b2760a-klusterlet-config\") pod \"klusterlet-addon-workmgr-544dfb5797-k44w4\" (UID: \"304e7398-03f0-41a8-8a0f-43ae08b2760a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" Apr 23 16:35:39.367706 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.367629 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/04f18820-a137-4fa2-804e-b8a6a7bf9eb4-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7749dbf8b5-zr4nh\" (UID: \"04f18820-a137-4fa2-804e-b8a6a7bf9eb4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 16:35:39.367706 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.367676 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9hz9\" (UniqueName: \"kubernetes.io/projected/04f18820-a137-4fa2-804e-b8a6a7bf9eb4-kube-api-access-k9hz9\") pod \"cluster-proxy-proxy-agent-7749dbf8b5-zr4nh\" (UID: \"04f18820-a137-4fa2-804e-b8a6a7bf9eb4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 16:35:39.367999 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.367816 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/304e7398-03f0-41a8-8a0f-43ae08b2760a-tmp\") pod \"klusterlet-addon-workmgr-544dfb5797-k44w4\" (UID: \"304e7398-03f0-41a8-8a0f-43ae08b2760a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" Apr 23 16:35:39.367999 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.367864 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24mmb\" (UniqueName: \"kubernetes.io/projected/304e7398-03f0-41a8-8a0f-43ae08b2760a-kube-api-access-24mmb\") pod \"klusterlet-addon-workmgr-544dfb5797-k44w4\" (UID: \"304e7398-03f0-41a8-8a0f-43ae08b2760a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" Apr 23 16:35:39.367999 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.367905 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n47hh\" (UniqueName: \"kubernetes.io/projected/704a1757-9f37-46f2-bf18-a099b1493b45-kube-api-access-n47hh\") pod \"managed-serviceaccount-addon-agent-785997c65f-bzbg8\" (UID: \"704a1757-9f37-46f2-bf18-a099b1493b45\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-785997c65f-bzbg8" Apr 23 16:35:39.367999 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.367978 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/04f18820-a137-4fa2-804e-b8a6a7bf9eb4-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7749dbf8b5-zr4nh\" (UID: \"04f18820-a137-4fa2-804e-b8a6a7bf9eb4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 16:35:39.368197 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.368099 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/04f18820-a137-4fa2-804e-b8a6a7bf9eb4-ca\") pod \"cluster-proxy-proxy-agent-7749dbf8b5-zr4nh\" (UID: \"04f18820-a137-4fa2-804e-b8a6a7bf9eb4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 16:35:39.368197 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.368147 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/04f18820-a137-4fa2-804e-b8a6a7bf9eb4-hub\") pod \"cluster-proxy-proxy-agent-7749dbf8b5-zr4nh\" (UID: \"04f18820-a137-4fa2-804e-b8a6a7bf9eb4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 16:35:39.368197 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.368175 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/704a1757-9f37-46f2-bf18-a099b1493b45-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-785997c65f-bzbg8\" (UID: \"704a1757-9f37-46f2-bf18-a099b1493b45\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-785997c65f-bzbg8" Apr 23 16:35:39.368351 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.368208 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/04f18820-a137-4fa2-804e-b8a6a7bf9eb4-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7749dbf8b5-zr4nh\" (UID: \"04f18820-a137-4fa2-804e-b8a6a7bf9eb4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 16:35:39.368351 ip-10-0-141-189 kubenswrapper[2561]: 
I0423 16:35:39.368223 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/304e7398-03f0-41a8-8a0f-43ae08b2760a-tmp\") pod \"klusterlet-addon-workmgr-544dfb5797-k44w4\" (UID: \"304e7398-03f0-41a8-8a0f-43ae08b2760a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" Apr 23 16:35:39.368351 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.368301 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/304e7398-03f0-41a8-8a0f-43ae08b2760a-klusterlet-config\") pod \"klusterlet-addon-workmgr-544dfb5797-k44w4\" (UID: \"304e7398-03f0-41a8-8a0f-43ae08b2760a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" Apr 23 16:35:39.368669 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.368624 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/04f18820-a137-4fa2-804e-b8a6a7bf9eb4-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7749dbf8b5-zr4nh\" (UID: \"04f18820-a137-4fa2-804e-b8a6a7bf9eb4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 16:35:39.370940 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.370914 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/04f18820-a137-4fa2-804e-b8a6a7bf9eb4-ca\") pod \"cluster-proxy-proxy-agent-7749dbf8b5-zr4nh\" (UID: \"04f18820-a137-4fa2-804e-b8a6a7bf9eb4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 16:35:39.371103 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.371083 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/04f18820-a137-4fa2-804e-b8a6a7bf9eb4-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7749dbf8b5-zr4nh\" (UID: \"04f18820-a137-4fa2-804e-b8a6a7bf9eb4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 16:35:39.371103 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.371093 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/704a1757-9f37-46f2-bf18-a099b1493b45-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-785997c65f-bzbg8\" (UID: \"704a1757-9f37-46f2-bf18-a099b1493b45\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-785997c65f-bzbg8" Apr 23 16:35:39.371233 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.371095 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/04f18820-a137-4fa2-804e-b8a6a7bf9eb4-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7749dbf8b5-zr4nh\" (UID: \"04f18820-a137-4fa2-804e-b8a6a7bf9eb4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 16:35:39.371448 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.371425 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/304e7398-03f0-41a8-8a0f-43ae08b2760a-klusterlet-config\") pod \"klusterlet-addon-workmgr-544dfb5797-k44w4\" (UID: \"304e7398-03f0-41a8-8a0f-43ae08b2760a\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" Apr 23 16:35:39.371940 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.371921 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/04f18820-a137-4fa2-804e-b8a6a7bf9eb4-hub\") pod \"cluster-proxy-proxy-agent-7749dbf8b5-zr4nh\" (UID: \"04f18820-a137-4fa2-804e-b8a6a7bf9eb4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 16:35:39.384572 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.384530 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n47hh\" (UniqueName: \"kubernetes.io/projected/704a1757-9f37-46f2-bf18-a099b1493b45-kube-api-access-n47hh\") pod \"managed-serviceaccount-addon-agent-785997c65f-bzbg8\" (UID: \"704a1757-9f37-46f2-bf18-a099b1493b45\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-785997c65f-bzbg8" Apr 23 16:35:39.384670 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.384619 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24mmb\" (UniqueName: \"kubernetes.io/projected/304e7398-03f0-41a8-8a0f-43ae08b2760a-kube-api-access-24mmb\") pod \"klusterlet-addon-workmgr-544dfb5797-k44w4\" (UID: \"304e7398-03f0-41a8-8a0f-43ae08b2760a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" Apr 23 16:35:39.384966 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.384947 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9hz9\" (UniqueName: \"kubernetes.io/projected/04f18820-a137-4fa2-804e-b8a6a7bf9eb4-kube-api-access-k9hz9\") pod \"cluster-proxy-proxy-agent-7749dbf8b5-zr4nh\" (UID: \"04f18820-a137-4fa2-804e-b8a6a7bf9eb4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 16:35:39.406840 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.406821 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" Apr 23 16:35:39.449570 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.449545 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 16:35:39.461173 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.461150 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-785997c65f-bzbg8" Apr 23 16:35:39.569628 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.569600 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:39.569755 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.569645 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls\") pod \"dns-default-cdjpw\" (UID: \"0f0e9f3d-ecd0-4e57-8ef1-447361404429\") " pod="openshift-dns/dns-default-cdjpw" Apr 23 16:35:39.569819 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:39.569755 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:39.569819 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:39.569758 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:35:39.569939 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:39.569822 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls podName:0f0e9f3d-ecd0-4e57-8ef1-447361404429 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:40.569803294 +0000 UTC m=+35.123237443 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls") pod "dns-default-cdjpw" (UID: "0f0e9f3d-ecd0-4e57-8ef1-447361404429") : secret "dns-default-metrics-tls" not found Apr 23 16:35:39.569939 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:39.569824 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f59db74c9-f44nw: secret "image-registry-tls" not found Apr 23 16:35:39.569939 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.569857 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert\") pod \"ingress-canary-49hdt\" (UID: \"c2a76a12-982a-438c-837c-0e7665a6f46c\") " pod="openshift-ingress-canary/ingress-canary-49hdt" Apr 23 16:35:39.569939 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:39.569896 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls podName:9effa94c-05cd-4a19-9dac-bcfc8c8f18f4 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:40.569864068 +0000 UTC m=+35.123298229 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls") pod "image-registry-6f59db74c9-f44nw" (UID: "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4") : secret "image-registry-tls" not found Apr 23 16:35:39.570123 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:39.569948 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:39.570123 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:39.569981 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert podName:c2a76a12-982a-438c-837c-0e7665a6f46c nodeName:}" failed. No retries permitted until 2026-04-23 16:35:40.569970013 +0000 UTC m=+35.123404164 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert") pod "ingress-canary-49hdt" (UID: "c2a76a12-982a-438c-837c-0e7665a6f46c") : secret "canary-serving-cert" not found Apr 23 16:35:39.772379 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.771784 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dpwd\" (UniqueName: \"kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd\") pod \"network-check-target-bclp4\" (UID: \"2f8d3b70-fe21-4feb-a984-12133895766b\") " pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:39.772379 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.771847 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs\") pod \"network-metrics-daemon-rg2cr\" (UID: \"5c564efe-4a26-4498-9f97-d71703d0aa18\") " pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:35:39.772379 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:39.772002 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 16:35:39.772379 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:39.772070 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs podName:5c564efe-4a26-4498-9f97-d71703d0aa18 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:11.772048332 +0000 UTC m=+66.325482482 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs") pod "network-metrics-daemon-rg2cr" (UID: "5c564efe-4a26-4498-9f97-d71703d0aa18") : secret "metrics-daemon-secret" not found Apr 23 16:35:39.778999 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.777556 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dpwd\" (UniqueName: \"kubernetes.io/projected/2f8d3b70-fe21-4feb-a984-12133895766b-kube-api-access-7dpwd\") pod \"network-check-target-bclp4\" (UID: \"2f8d3b70-fe21-4feb-a984-12133895766b\") " pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:39.890476 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.890252 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-785997c65f-bzbg8"] Apr 23 16:35:39.894375 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.894332 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh"] Apr 23 16:35:39.909676 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.909643 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4"] Apr 23 16:35:39.951838 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:39.951770 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:39.999032 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:39.999006 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod704a1757_9f37_46f2_bf18_a099b1493b45.slice/crio-2170718fef13858246ab001746625d5713d10281d33a8768584762ec8d5a11c5 WatchSource:0}: Error finding container 2170718fef13858246ab001746625d5713d10281d33a8768584762ec8d5a11c5: Status 404 returned error can't find the container with id 2170718fef13858246ab001746625d5713d10281d33a8768584762ec8d5a11c5 Apr 23 16:35:39.999316 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:39.999290 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04f18820_a137_4fa2_804e_b8a6a7bf9eb4.slice/crio-983df02f392063d8e927a5530495135a611c6384216da52a8c4f3b796a950247 WatchSource:0}: Error finding container 983df02f392063d8e927a5530495135a611c6384216da52a8c4f3b796a950247: Status 404 returned error can't find the container with id 983df02f392063d8e927a5530495135a611c6384216da52a8c4f3b796a950247 Apr 23 16:35:40.005551 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:40.000050 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod304e7398_03f0_41a8_8a0f_43ae08b2760a.slice/crio-d202be21fc7b6617cd01efa847f3297bac1f65205999153b3e41c0e85e640598 WatchSource:0}: Error finding container d202be21fc7b6617cd01efa847f3297bac1f65205999153b3e41c0e85e640598: Status 404 returned error can't find the container with id d202be21fc7b6617cd01efa847f3297bac1f65205999153b3e41c0e85e640598 Apr 23 16:35:40.158061 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:40.158023 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bclp4"] Apr 23 16:35:40.162370 ip-10-0-141-189 kubenswrapper[2561]: I0423 
16:35:40.162342 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-785997c65f-bzbg8" event={"ID":"704a1757-9f37-46f2-bf18-a099b1493b45","Type":"ContainerStarted","Data":"2170718fef13858246ab001746625d5713d10281d33a8768584762ec8d5a11c5"} Apr 23 16:35:40.163175 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:40.163152 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" event={"ID":"04f18820-a137-4fa2-804e-b8a6a7bf9eb4","Type":"ContainerStarted","Data":"983df02f392063d8e927a5530495135a611c6384216da52a8c4f3b796a950247"} Apr 23 16:35:40.163954 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:40.163936 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" event={"ID":"304e7398-03f0-41a8-8a0f-43ae08b2760a","Type":"ContainerStarted","Data":"d202be21fc7b6617cd01efa847f3297bac1f65205999153b3e41c0e85e640598"} Apr 23 16:35:40.168769 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:40.168737 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f8d3b70_fe21_4feb_a984_12133895766b.slice/crio-2a635568c79c438fae74531c1bba7b3f195808df3911e67db517e1861344e93e WatchSource:0}: Error finding container 2a635568c79c438fae74531c1bba7b3f195808df3911e67db517e1861344e93e: Status 404 returned error can't find the container with id 2a635568c79c438fae74531c1bba7b3f195808df3911e67db517e1861344e93e Apr 23 16:35:40.578356 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:40.578318 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:40.578554 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:40.578375 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls\") pod \"dns-default-cdjpw\" (UID: \"0f0e9f3d-ecd0-4e57-8ef1-447361404429\") " pod="openshift-dns/dns-default-cdjpw" Apr 23 16:35:40.578554 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:40.578459 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert\") pod \"ingress-canary-49hdt\" (UID: \"c2a76a12-982a-438c-837c-0e7665a6f46c\") " pod="openshift-ingress-canary/ingress-canary-49hdt" Apr 23 16:35:40.578554 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:40.578489 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:35:40.578554 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:40.578512 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f59db74c9-f44nw: secret "image-registry-tls" not found Apr 23 16:35:40.578554 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:40.578514 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:40.578801 ip-10-0-141-189 kubenswrapper[2561]: E0423 
16:35:40.578566 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:40.578801 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:40.578578 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls podName:9effa94c-05cd-4a19-9dac-bcfc8c8f18f4 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:42.578556634 +0000 UTC m=+37.131990783 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls") pod "image-registry-6f59db74c9-f44nw" (UID: "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4") : secret "image-registry-tls" not found Apr 23 16:35:40.578801 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:40.578597 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls podName:0f0e9f3d-ecd0-4e57-8ef1-447361404429 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:42.578587462 +0000 UTC m=+37.132021609 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls") pod "dns-default-cdjpw" (UID: "0f0e9f3d-ecd0-4e57-8ef1-447361404429") : secret "dns-default-metrics-tls" not found Apr 23 16:35:40.578801 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:40.578612 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert podName:c2a76a12-982a-438c-837c-0e7665a6f46c nodeName:}" failed. No retries permitted until 2026-04-23 16:35:42.57860425 +0000 UTC m=+37.132038397 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert") pod "ingress-canary-49hdt" (UID: "c2a76a12-982a-438c-837c-0e7665a6f46c") : secret "canary-serving-cert" not found Apr 23 16:35:40.679411 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:40.679214 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret\") pod \"global-pull-secret-syncer-ng5vn\" (UID: \"867d699c-6a2c-41bb-b885-26b9cd952983\") " pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:40.686199 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:40.686169 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/867d699c-6a2c-41bb-b885-26b9cd952983-original-pull-secret\") pod \"global-pull-secret-syncer-ng5vn\" (UID: \"867d699c-6a2c-41bb-b885-26b9cd952983\") " pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:40.840144 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:40.840069 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-ng5vn" Apr 23 16:35:41.008830 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:41.008800 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ng5vn"] Apr 23 16:35:41.012767 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:35:41.012735 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod867d699c_6a2c_41bb_b885_26b9cd952983.slice/crio-7e06491144e894f6250b279a1e47f1287e354aaace07709933da726dab4691d8 WatchSource:0}: Error finding container 7e06491144e894f6250b279a1e47f1287e354aaace07709933da726dab4691d8: Status 404 returned error can't find the container with id 7e06491144e894f6250b279a1e47f1287e354aaace07709933da726dab4691d8 Apr 23 16:35:41.168579 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:41.168475 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ng5vn" event={"ID":"867d699c-6a2c-41bb-b885-26b9cd952983","Type":"ContainerStarted","Data":"7e06491144e894f6250b279a1e47f1287e354aaace07709933da726dab4691d8"} Apr 23 16:35:41.171941 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:41.171858 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bclp4" event={"ID":"2f8d3b70-fe21-4feb-a984-12133895766b","Type":"ContainerStarted","Data":"2a635568c79c438fae74531c1bba7b3f195808df3911e67db517e1861344e93e"} Apr 23 16:35:41.183428 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:41.180121 2561 generic.go:358] "Generic (PLEG): container finished" podID="3b9f6eca-1a5a-42f3-acf5-01a9d352a780" containerID="3738419c4373065bbd87d7c109bc925e70cd881b96e6f99849d66b0322217c66" exitCode=0 Apr 23 16:35:41.183428 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:41.180176 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6tcrq" event={"ID":"3b9f6eca-1a5a-42f3-acf5-01a9d352a780","Type":"ContainerDied","Data":"3738419c4373065bbd87d7c109bc925e70cd881b96e6f99849d66b0322217c66"} Apr 23 16:35:42.189760 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:42.189493 2561 generic.go:358] "Generic (PLEG): container finished" podID="3b9f6eca-1a5a-42f3-acf5-01a9d352a780" containerID="181ebbf03f80fbba4ff9276af3109454efc9ce64dd4ab5cdfbffc56ba472fc95" exitCode=0 Apr 23 16:35:42.189760 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:42.189651 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6tcrq" event={"ID":"3b9f6eca-1a5a-42f3-acf5-01a9d352a780","Type":"ContainerDied","Data":"181ebbf03f80fbba4ff9276af3109454efc9ce64dd4ab5cdfbffc56ba472fc95"} Apr 23 16:35:42.595160 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:42.595122 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert\") pod \"ingress-canary-49hdt\" (UID: \"c2a76a12-982a-438c-837c-0e7665a6f46c\") " pod="openshift-ingress-canary/ingress-canary-49hdt" Apr 23 16:35:42.595335 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:42.595198 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " 
pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:42.595335 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:42.595229 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls\") pod \"dns-default-cdjpw\" (UID: \"0f0e9f3d-ecd0-4e57-8ef1-447361404429\") " pod="openshift-dns/dns-default-cdjpw" Apr 23 16:35:42.595458 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:42.595351 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:42.595458 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:42.595408 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls podName:0f0e9f3d-ecd0-4e57-8ef1-447361404429 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:46.595390152 +0000 UTC m=+41.148824319 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls") pod "dns-default-cdjpw" (UID: "0f0e9f3d-ecd0-4e57-8ef1-447361404429") : secret "dns-default-metrics-tls" not found Apr 23 16:35:42.595813 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:42.595793 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:42.595921 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:42.595848 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert podName:c2a76a12-982a-438c-837c-0e7665a6f46c nodeName:}" failed. No retries permitted until 2026-04-23 16:35:46.595832583 +0000 UTC m=+41.149266748 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert") pod "ingress-canary-49hdt" (UID: "c2a76a12-982a-438c-837c-0e7665a6f46c") : secret "canary-serving-cert" not found Apr 23 16:35:42.595988 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:42.595919 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:35:42.595988 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:42.595930 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f59db74c9-f44nw: secret "image-registry-tls" not found Apr 23 16:35:42.595988 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:42.595968 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls podName:9effa94c-05cd-4a19-9dac-bcfc8c8f18f4 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:46.595955544 +0000 UTC m=+41.149389694 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls") pod "image-registry-6f59db74c9-f44nw" (UID: "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4") : secret "image-registry-tls" not found Apr 23 16:35:46.629693 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:46.629652 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:35:46.630129 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:46.629704 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls\") pod \"dns-default-cdjpw\" (UID: \"0f0e9f3d-ecd0-4e57-8ef1-447361404429\") " pod="openshift-dns/dns-default-cdjpw" Apr 23 16:35:46.630129 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:46.629771 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert\") pod \"ingress-canary-49hdt\" (UID: \"c2a76a12-982a-438c-837c-0e7665a6f46c\") " pod="openshift-ingress-canary/ingress-canary-49hdt" Apr 23 16:35:46.630129 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:46.629848 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:35:46.630129 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:46.629886 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f59db74c9-f44nw: secret "image-registry-tls" not found Apr 23 16:35:46.630129 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:46.629871 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:46.630129 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:46.629908 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:46.630129 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:46.629953 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls podName:9effa94c-05cd-4a19-9dac-bcfc8c8f18f4 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:54.629933227 +0000 UTC m=+49.183367377 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls") pod "image-registry-6f59db74c9-f44nw" (UID: "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4") : secret "image-registry-tls" not found Apr 23 16:35:46.630129 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:46.629971 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls podName:0f0e9f3d-ecd0-4e57-8ef1-447361404429 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:54.629961924 +0000 UTC m=+49.183396077 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls") pod "dns-default-cdjpw" (UID: "0f0e9f3d-ecd0-4e57-8ef1-447361404429") : secret "dns-default-metrics-tls" not found Apr 23 16:35:46.630129 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:46.629986 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert podName:c2a76a12-982a-438c-837c-0e7665a6f46c nodeName:}" failed. No retries permitted until 2026-04-23 16:35:54.629977304 +0000 UTC m=+49.183411455 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert") pod "ingress-canary-49hdt" (UID: "c2a76a12-982a-438c-837c-0e7665a6f46c") : secret "canary-serving-cert" not found Apr 23 16:35:50.207506 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:50.207475 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" event={"ID":"304e7398-03f0-41a8-8a0f-43ae08b2760a","Type":"ContainerStarted","Data":"02757d28b6188618161824c4a5f00a155bea325ac5aa579244df974890fd3339"} Apr 23 16:35:50.208012 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:50.207719 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" Apr 23 16:35:50.208980 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:50.208910 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ng5vn" event={"ID":"867d699c-6a2c-41bb-b885-26b9cd952983","Type":"ContainerStarted","Data":"0a2a716f10465e8acbf00b73bcf0106c890e007219bcc4365d4be0960d9f7e32"} Apr 23 16:35:50.209404 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:50.209383 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" Apr 23 16:35:50.210190 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:50.210170 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bclp4" event={"ID":"2f8d3b70-fe21-4feb-a984-12133895766b","Type":"ContainerStarted","Data":"d5e0890e06dca400781f7bb3a047c9ca505d0b67b6f11e940bd998a15e888de6"} Apr 23 16:35:50.210263 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:50.210206 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bclp4" Apr 23 16:35:50.215222 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:50.215199 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6tcrq" event={"ID":"3b9f6eca-1a5a-42f3-acf5-01a9d352a780","Type":"ContainerStarted","Data":"9ab531c4543db5e5eb05cde76a55e54a76ec93b6eca26871b2e27b707f738bd3"} Apr 23 16:35:50.216387 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:50.216370 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-785997c65f-bzbg8" event={"ID":"704a1757-9f37-46f2-bf18-a099b1493b45","Type":"ContainerStarted","Data":"82eadb08b1aaf2f128caf5a3957a059dcbd11ff2e37cd191b30dd76bbe73235b"} Apr 23 16:35:50.217458 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:50.217441 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" event={"ID":"04f18820-a137-4fa2-804e-b8a6a7bf9eb4","Type":"ContainerStarted","Data":"f94df18574946f89fa68e7779ca063d74240313f5b04331fbd14a39ad0357b4b"} Apr 23 16:35:50.232984 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:50.232940 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" podStartSLOduration=1.533571094 podStartE2EDuration="11.23292931s" podCreationTimestamp="2026-04-23 16:35:39 +0000 UTC" firstStartedPulling="2026-04-23 16:35:40.025976262 +0000 UTC m=+34.579410410" lastFinishedPulling="2026-04-23 16:35:49.725334465 +0000 UTC m=+44.278768626" observedRunningTime="2026-04-23 16:35:50.23223402 +0000 UTC m=+44.785668190" watchObservedRunningTime="2026-04-23 16:35:50.23292931 +0000 UTC m=+44.786363478" Apr 23 16:35:50.253083 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:50.253042 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-ng5vn" podStartSLOduration=33.527345054 podStartE2EDuration="42.253029224s" podCreationTimestamp="2026-04-23 16:35:08 +0000 UTC" firstStartedPulling="2026-04-23 16:35:41.015601398 +0000 UTC m=+35.569035550" lastFinishedPulling="2026-04-23 16:35:49.741285555 +0000 UTC m=+44.294719720" observedRunningTime="2026-04-23 16:35:50.253019901 +0000 UTC m=+44.806454072" watchObservedRunningTime="2026-04-23 16:35:50.253029224 +0000 UTC m=+44.806463391" Apr 23 16:35:50.276449 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:50.276415 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-785997c65f-bzbg8" podStartSLOduration=1.5776790429999998 podStartE2EDuration="11.276407778s" podCreationTimestamp="2026-04-23 16:35:39 +0000 UTC" firstStartedPulling="2026-04-23 16:35:40.026142473 +0000 UTC m=+34.579576627" lastFinishedPulling="2026-04-23 16:35:49.724871199 +0000 UTC m=+44.278305362" observedRunningTime="2026-04-23 16:35:50.27551556 +0000 UTC m=+44.828949730" watchObservedRunningTime="2026-04-23 16:35:50.276407778 +0000 UTC m=+44.829841947" Apr 23 16:35:50.318517 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:50.318485 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6tcrq" podStartSLOduration=13.059192107 podStartE2EDuration="44.318475138s" podCreationTimestamp="2026-04-23 16:35:06 +0000 UTC" firstStartedPulling="2026-04-23 16:35:08.791758027 +0000 UTC m=+3.345192173" lastFinishedPulling="2026-04-23 16:35:40.05104104 +0000 UTC m=+34.604475204" observedRunningTime="2026-04-23 16:35:50.316935667 +0000 UTC m=+44.870369835" watchObservedRunningTime="2026-04-23 16:35:50.318475138 +0000 UTC m=+44.871909306" Apr 23 16:35:50.337011 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:50.336962 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bclp4" podStartSLOduration=34.782182261 podStartE2EDuration="44.33695354s" podCreationTimestamp="2026-04-23 16:35:06 +0000 UTC" firstStartedPulling="2026-04-23 16:35:40.170563647 +0000 UTC m=+34.723997795" lastFinishedPulling="2026-04-23 16:35:49.725334915 +0000 UTC m=+44.278769074" observedRunningTime="2026-04-23 16:35:50.336661124 +0000 UTC m=+44.890095293" watchObservedRunningTime="2026-04-23 16:35:50.33695354 +0000 UTC m=+44.890387701" 
Apr 23 16:35:53.226302 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:53.226268 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" event={"ID":"04f18820-a137-4fa2-804e-b8a6a7bf9eb4","Type":"ContainerStarted","Data":"ee852c73d67398dbda2e71bb0f47f9202c21896b1b760221b9ddf95374d3448b"}
Apr 23 16:35:53.226302 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:53.226307 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" event={"ID":"04f18820-a137-4fa2-804e-b8a6a7bf9eb4","Type":"ContainerStarted","Data":"9118f978060d33e12625a03796f0cf94f80fd90b56d85955be33877c6be5b1f4"}
Apr 23 16:35:54.691689 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:54.691657 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls\") pod \"dns-default-cdjpw\" (UID: \"0f0e9f3d-ecd0-4e57-8ef1-447361404429\") " pod="openshift-dns/dns-default-cdjpw"
Apr 23 16:35:54.692141 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:54.691713 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert\") pod \"ingress-canary-49hdt\" (UID: \"c2a76a12-982a-438c-837c-0e7665a6f46c\") " pod="openshift-ingress-canary/ingress-canary-49hdt"
Apr 23 16:35:54.692141 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:35:54.691764 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw"
Apr 23 16:35:54.692141 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:54.691803 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:35:54.692141 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:54.691843 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 16:35:54.692141 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:54.691854 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f59db74c9-f44nw: secret "image-registry-tls" not found
Apr 23 16:35:54.692141 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:54.691895 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls podName:0f0e9f3d-ecd0-4e57-8ef1-447361404429 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:10.691853889 +0000 UTC m=+65.245288046 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls") pod "dns-default-cdjpw" (UID: "0f0e9f3d-ecd0-4e57-8ef1-447361404429") : secret "dns-default-metrics-tls" not found
Apr 23 16:35:54.692141 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:54.691907 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:35:54.692141 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:54.691913 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls podName:9effa94c-05cd-4a19-9dac-bcfc8c8f18f4 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:10.691902618 +0000 UTC m=+65.245336765 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls") pod "image-registry-6f59db74c9-f44nw" (UID: "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4") : secret "image-registry-tls" not found
Apr 23 16:35:54.692141 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:35:54.691984 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert podName:c2a76a12-982a-438c-837c-0e7665a6f46c nodeName:}" failed. No retries permitted until 2026-04-23 16:36:10.691966315 +0000 UTC m=+65.245400462 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert") pod "ingress-canary-49hdt" (UID: "c2a76a12-982a-438c-837c-0e7665a6f46c") : secret "canary-serving-cert" not found
Apr 23 16:36:04.656585 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:36:04.656557 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jhhzs"
Apr 23 16:36:04.691482 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:36:04.691436 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" podStartSLOduration=13.382634437 podStartE2EDuration="25.691424611s" podCreationTimestamp="2026-04-23 16:35:39 +0000 UTC" firstStartedPulling="2026-04-23 16:35:40.026138934 +0000 UTC m=+34.579573085" lastFinishedPulling="2026-04-23 16:35:52.334929109 +0000 UTC m=+46.888363259" observedRunningTime="2026-04-23 16:35:53.274765275 +0000 UTC m=+47.828199443" watchObservedRunningTime="2026-04-23 16:36:04.691424611 +0000 UTC m=+59.244858779"
Apr 23 16:36:10.705245 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:36:10.705206 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert\") pod \"ingress-canary-49hdt\" (UID: \"c2a76a12-982a-438c-837c-0e7665a6f46c\") " pod="openshift-ingress-canary/ingress-canary-49hdt"
Apr 23 16:36:10.705766 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:36:10.705265 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw"
Apr 23 16:36:10.705766 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:36:10.705293 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls\") pod \"dns-default-cdjpw\" (UID: \"0f0e9f3d-ecd0-4e57-8ef1-447361404429\") " pod="openshift-dns/dns-default-cdjpw"
Apr 23 16:36:10.705766 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:36:10.705358 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:36:10.705766 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:36:10.705381 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:36:10.705766 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:36:10.705429 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert podName:c2a76a12-982a-438c-837c-0e7665a6f46c nodeName:}" failed. No retries permitted until 2026-04-23 16:36:42.70540952 +0000 UTC m=+97.258843677 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert") pod "ingress-canary-49hdt" (UID: "c2a76a12-982a-438c-837c-0e7665a6f46c") : secret "canary-serving-cert" not found
Apr 23 16:36:10.705766 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:36:10.705443 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls podName:0f0e9f3d-ecd0-4e57-8ef1-447361404429 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:42.705437024 +0000 UTC m=+97.258871174 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls") pod "dns-default-cdjpw" (UID: "0f0e9f3d-ecd0-4e57-8ef1-447361404429") : secret "dns-default-metrics-tls" not found
Apr 23 16:36:10.705766 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:36:10.705448 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 16:36:10.705766 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:36:10.705468 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f59db74c9-f44nw: secret "image-registry-tls" not found
Apr 23 16:36:10.705766 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:36:10.705546 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls podName:9effa94c-05cd-4a19-9dac-bcfc8c8f18f4 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:42.70552768 +0000 UTC m=+97.258961827 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls") pod "image-registry-6f59db74c9-f44nw" (UID: "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4") : secret "image-registry-tls" not found
Apr 23 16:36:11.814570 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:36:11.814532 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs\") pod \"network-metrics-daemon-rg2cr\" (UID: \"5c564efe-4a26-4498-9f97-d71703d0aa18\") " pod="openshift-multus/network-metrics-daemon-rg2cr"
Apr 23 16:36:11.815087 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:36:11.814713 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 16:36:11.815087 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:36:11.814796 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs podName:5c564efe-4a26-4498-9f97-d71703d0aa18 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:15.814775498 +0000 UTC m=+130.368209664 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs") pod "network-metrics-daemon-rg2cr" (UID: "5c564efe-4a26-4498-9f97-d71703d0aa18") : secret "metrics-daemon-secret" not found
Apr 23 16:36:21.223178 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:36:21.223149 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bclp4"
Apr 23 16:36:42.722982 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:36:42.722946 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert\") pod \"ingress-canary-49hdt\" (UID: \"c2a76a12-982a-438c-837c-0e7665a6f46c\") " pod="openshift-ingress-canary/ingress-canary-49hdt"
Apr 23 16:36:42.723424 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:36:42.723008 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw"
Apr 23 16:36:42.723424 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:36:42.723036 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls\") pod \"dns-default-cdjpw\" (UID: \"0f0e9f3d-ecd0-4e57-8ef1-447361404429\") " pod="openshift-dns/dns-default-cdjpw"
Apr 23 16:36:42.723424 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:36:42.723113 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:36:42.723424 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:36:42.723151 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 16:36:42.723424 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:36:42.723165 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:36:42.723424 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:36:42.723193 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert podName:c2a76a12-982a-438c-837c-0e7665a6f46c nodeName:}" failed. No retries permitted until 2026-04-23 16:37:46.723176968 +0000 UTC m=+161.276611116 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert") pod "ingress-canary-49hdt" (UID: "c2a76a12-982a-438c-837c-0e7665a6f46c") : secret "canary-serving-cert" not found
Apr 23 16:36:42.723424 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:36:42.723226 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls podName:0f0e9f3d-ecd0-4e57-8ef1-447361404429 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:46.723209139 +0000 UTC m=+161.276643300 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls") pod "dns-default-cdjpw" (UID: "0f0e9f3d-ecd0-4e57-8ef1-447361404429") : secret "dns-default-metrics-tls" not found
Apr 23 16:36:42.723424 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:36:42.723167 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f59db74c9-f44nw: secret "image-registry-tls" not found
Apr 23 16:36:42.723424 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:36:42.723274 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls podName:9effa94c-05cd-4a19-9dac-bcfc8c8f18f4 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:46.723264271 +0000 UTC m=+161.276698430 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls") pod "image-registry-6f59db74c9-f44nw" (UID: "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4") : secret "image-registry-tls" not found
Apr 23 16:37:15.857851 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:15.857806 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs\") pod \"network-metrics-daemon-rg2cr\" (UID: \"5c564efe-4a26-4498-9f97-d71703d0aa18\") " pod="openshift-multus/network-metrics-daemon-rg2cr"
Apr 23 16:37:15.858308 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:37:15.857973 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 16:37:15.858308 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:37:15.858051 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs podName:5c564efe-4a26-4498-9f97-d71703d0aa18 nodeName:}" failed. No retries permitted until 2026-04-23 16:39:17.858034779 +0000 UTC m=+252.411468926 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs") pod "network-metrics-daemon-rg2cr" (UID: "5c564efe-4a26-4498-9f97-d71703d0aa18") : secret "metrics-daemon-secret" not found
Apr 23 16:37:19.991832 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:19.991802 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-476qf_842de44e-cb5c-472a-9cf5-7d48346188d8/dns-node-resolver/0.log"
Apr 23 16:37:20.992587 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:20.992563 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7wntm_9fd1b1de-6101-4e7d-a523-325e848a740a/node-ca/0.log"
Apr 23 16:37:41.223961 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.223933 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-ps65z"]
Apr 23 16:37:41.226759 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.226745 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ps65z"
Apr 23 16:37:41.240183 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.240159 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9vln\" (UniqueName: \"kubernetes.io/projected/148ee375-144e-4112-aa97-25371781944e-kube-api-access-x9vln\") pod \"insights-runtime-extractor-ps65z\" (UID: \"148ee375-144e-4112-aa97-25371781944e\") " pod="openshift-insights/insights-runtime-extractor-ps65z"
Apr 23 16:37:41.240288 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.240191 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/148ee375-144e-4112-aa97-25371781944e-crio-socket\") pod \"insights-runtime-extractor-ps65z\" (UID: \"148ee375-144e-4112-aa97-25371781944e\") " pod="openshift-insights/insights-runtime-extractor-ps65z"
Apr 23 16:37:41.240288 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.240225 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/148ee375-144e-4112-aa97-25371781944e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ps65z\" (UID: \"148ee375-144e-4112-aa97-25371781944e\") " pod="openshift-insights/insights-runtime-extractor-ps65z"
Apr 23 16:37:41.240354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.240289 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/148ee375-144e-4112-aa97-25371781944e-data-volume\") pod \"insights-runtime-extractor-ps65z\" (UID: \"148ee375-144e-4112-aa97-25371781944e\") " pod="openshift-insights/insights-runtime-extractor-ps65z"
Apr 23 16:37:41.240354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.240316 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/148ee375-144e-4112-aa97-25371781944e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ps65z\" (UID: \"148ee375-144e-4112-aa97-25371781944e\") " pod="openshift-insights/insights-runtime-extractor-ps65z"
Apr 23 16:37:41.242491 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.242472 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 23 16:37:41.242590 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.242575 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 23 16:37:41.242624 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.242610 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 23 16:37:41.242765 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.242747 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-nnwmz\""
Apr 23 16:37:41.242850 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.242838 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 23 16:37:41.268109 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.268087 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ps65z"]
Apr 23 16:37:41.341225 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.341200 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9vln\" (UniqueName: \"kubernetes.io/projected/148ee375-144e-4112-aa97-25371781944e-kube-api-access-x9vln\") pod \"insights-runtime-extractor-ps65z\" (UID: \"148ee375-144e-4112-aa97-25371781944e\") " pod="openshift-insights/insights-runtime-extractor-ps65z"
Apr 23 16:37:41.341363 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.341235 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/148ee375-144e-4112-aa97-25371781944e-crio-socket\") pod \"insights-runtime-extractor-ps65z\" (UID: \"148ee375-144e-4112-aa97-25371781944e\") " pod="openshift-insights/insights-runtime-extractor-ps65z"
Apr 23 16:37:41.341363 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.341282 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/148ee375-144e-4112-aa97-25371781944e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ps65z\" (UID: \"148ee375-144e-4112-aa97-25371781944e\") " pod="openshift-insights/insights-runtime-extractor-ps65z"
Apr 23 16:37:41.341363 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.341309 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/148ee375-144e-4112-aa97-25371781944e-data-volume\") pod \"insights-runtime-extractor-ps65z\" (UID: \"148ee375-144e-4112-aa97-25371781944e\") " pod="openshift-insights/insights-runtime-extractor-ps65z"
Apr 23 16:37:41.341363 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.341331 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/148ee375-144e-4112-aa97-25371781944e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ps65z\" (UID: \"148ee375-144e-4112-aa97-25371781944e\") " pod="openshift-insights/insights-runtime-extractor-ps65z"
Apr 23 16:37:41.341496 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.341403 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/148ee375-144e-4112-aa97-25371781944e-crio-socket\") pod \"insights-runtime-extractor-ps65z\" (UID: \"148ee375-144e-4112-aa97-25371781944e\") " pod="openshift-insights/insights-runtime-extractor-ps65z"
Apr 23 16:37:41.344437 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.344416 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/148ee375-144e-4112-aa97-25371781944e-data-volume\") pod \"insights-runtime-extractor-ps65z\" (UID: \"148ee375-144e-4112-aa97-25371781944e\") " pod="openshift-insights/insights-runtime-extractor-ps65z"
Apr 23 16:37:41.344501 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.344456 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/148ee375-144e-4112-aa97-25371781944e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ps65z\" (UID: \"148ee375-144e-4112-aa97-25371781944e\") " pod="openshift-insights/insights-runtime-extractor-ps65z"
Apr 23 16:37:41.345137 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.345121 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/148ee375-144e-4112-aa97-25371781944e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ps65z\" (UID: \"148ee375-144e-4112-aa97-25371781944e\") " pod="openshift-insights/insights-runtime-extractor-ps65z"
Apr 23 16:37:41.360468 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.360439 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9vln\" (UniqueName: \"kubernetes.io/projected/148ee375-144e-4112-aa97-25371781944e-kube-api-access-x9vln\") pod \"insights-runtime-extractor-ps65z\" (UID: \"148ee375-144e-4112-aa97-25371781944e\") " pod="openshift-insights/insights-runtime-extractor-ps65z"
Apr 23 16:37:41.535417 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.535396 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ps65z"
Apr 23 16:37:41.660168 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:41.660133 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ps65z"]
Apr 23 16:37:41.670121 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:37:41.670094 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod148ee375_144e_4112_aa97_25371781944e.slice/crio-7ad4642870e08a237126e166241a61269bad56399a52b5aa3b6d283c8adc89b6 WatchSource:0}: Error finding container 7ad4642870e08a237126e166241a61269bad56399a52b5aa3b6d283c8adc89b6: Status 404 returned error can't find the container with id 7ad4642870e08a237126e166241a61269bad56399a52b5aa3b6d283c8adc89b6
Apr 23 16:37:41.866660 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:37:41.866579 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" podUID="9effa94c-05cd-4a19-9dac-bcfc8c8f18f4"
Apr 23 16:37:41.877988 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:37:41.877965 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-49hdt" podUID="c2a76a12-982a-438c-837c-0e7665a6f46c"
Apr 23 16:37:41.891239 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:37:41.891216 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-cdjpw" podUID="0f0e9f3d-ecd0-4e57-8ef1-447361404429"
Apr 23 16:37:42.046491 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:37:42.046458 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-rg2cr" podUID="5c564efe-4a26-4498-9f97-d71703d0aa18"
Apr 23 16:37:42.475939 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:42.475899 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ps65z" event={"ID":"148ee375-144e-4112-aa97-25371781944e","Type":"ContainerStarted","Data":"a9a409bef740f0237df64c05e9e4dac763847675ad78587836e6b47579a4c9f7"}
Need to start a new one" pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:37:42.476253 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:42.475943 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ps65z" event={"ID":"148ee375-144e-4112-aa97-25371781944e","Type":"ContainerStarted","Data":"d9075998a43c989dd8865af5b41463b0f987e247410de14ee3e80217831ec267"} Apr 23 16:37:42.476253 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:42.475956 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ps65z" event={"ID":"148ee375-144e-4112-aa97-25371781944e","Type":"ContainerStarted","Data":"7ad4642870e08a237126e166241a61269bad56399a52b5aa3b6d283c8adc89b6"} Apr 23 16:37:42.476253 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:42.475942 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-49hdt" Apr 23 16:37:42.476253 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:42.475931 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cdjpw" Apr 23 16:37:44.482031 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:44.481998 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ps65z" event={"ID":"148ee375-144e-4112-aa97-25371781944e","Type":"ContainerStarted","Data":"d1e5af42e190588d358c33499516c472a2173864420bbd4275a8282d14ca53c6"} Apr 23 16:37:44.512069 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:44.512017 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-ps65z" podStartSLOduration=1.507461183 podStartE2EDuration="3.51200189s" podCreationTimestamp="2026-04-23 16:37:41 +0000 UTC" firstStartedPulling="2026-04-23 16:37:41.730855466 +0000 UTC m=+156.284289614" lastFinishedPulling="2026-04-23 16:37:43.735396158 +0000 UTC m=+158.288830321" observedRunningTime="2026-04-23 16:37:44.511327619 +0000 UTC m=+159.064761788" watchObservedRunningTime="2026-04-23 16:37:44.51200189 +0000 UTC m=+159.065436044" Apr 23 16:37:46.781741 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:46.781694 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:37:46.782240 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:46.781761 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert\") pod \"ingress-canary-49hdt\" (UID: \"c2a76a12-982a-438c-837c-0e7665a6f46c\") " pod="openshift-ingress-canary/ingress-canary-49hdt" Apr 23 16:37:46.782240 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:46.781789 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls\") pod \"dns-default-cdjpw\" (UID: \"0f0e9f3d-ecd0-4e57-8ef1-447361404429\") " pod="openshift-dns/dns-default-cdjpw" Apr 23 16:37:46.784060 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:46.784037 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/0f0e9f3d-ecd0-4e57-8ef1-447361404429-metrics-tls\") pod \"dns-default-cdjpw\" (UID: \"0f0e9f3d-ecd0-4e57-8ef1-447361404429\") " pod="openshift-dns/dns-default-cdjpw" Apr 23 16:37:46.784138 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:46.784121 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2a76a12-982a-438c-837c-0e7665a6f46c-cert\") pod \"ingress-canary-49hdt\" (UID: \"c2a76a12-982a-438c-837c-0e7665a6f46c\") " pod="openshift-ingress-canary/ingress-canary-49hdt" Apr 23 16:37:46.784219 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:46.784204 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls\") pod \"image-registry-6f59db74c9-f44nw\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:37:46.979836 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:46.979806 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-f8556\"" Apr 23 16:37:46.980926 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:46.980907 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jmk6t\"" Apr 23 16:37:46.981003 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:46.980946 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-s982g\"" Apr 23 16:37:46.987818 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:46.987799 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-49hdt" Apr 23 16:37:46.987909 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:46.987820 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:37:46.987909 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:46.987889 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-cdjpw" Apr 23 16:37:47.140728 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:47.140662 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6f59db74c9-f44nw"] Apr 23 16:37:47.143120 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:37:47.143094 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9effa94c_05cd_4a19_9dac_bcfc8c8f18f4.slice/crio-cd2c483f11ce2c4fd8423c7b38d0e002169ac094f440eb12b3de5d6295e93625 WatchSource:0}: Error finding container cd2c483f11ce2c4fd8423c7b38d0e002169ac094f440eb12b3de5d6295e93625: Status 404 returned error can't find the container with id cd2c483f11ce2c4fd8423c7b38d0e002169ac094f440eb12b3de5d6295e93625 Apr 23 16:37:47.156306 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:47.156288 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cdjpw"] Apr 23 16:37:47.157974 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:37:47.157950 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f0e9f3d_ecd0_4e57_8ef1_447361404429.slice/crio-6759f73bdb1e87ae98173ece3089ead97f198e0c2caa01d4ea0cc8888c0fae0c WatchSource:0}: Error finding container 6759f73bdb1e87ae98173ece3089ead97f198e0c2caa01d4ea0cc8888c0fae0c: Status 404 returned error can't find the container with id 6759f73bdb1e87ae98173ece3089ead97f198e0c2caa01d4ea0cc8888c0fae0c Apr 23 16:37:47.165624 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:47.165604 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-49hdt"] Apr 23 16:37:47.176849 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:37:47.176823 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2a76a12_982a_438c_837c_0e7665a6f46c.slice/crio-774a58a696695b758727668c35975c68c2f5a778470a1cd3f003c289eda79126 WatchSource:0}: Error finding container 774a58a696695b758727668c35975c68c2f5a778470a1cd3f003c289eda79126: Status 404 returned error can't find the container with id 774a58a696695b758727668c35975c68c2f5a778470a1cd3f003c289eda79126 Apr 23 16:37:47.490910 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:47.490821 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-49hdt" event={"ID":"c2a76a12-982a-438c-837c-0e7665a6f46c","Type":"ContainerStarted","Data":"774a58a696695b758727668c35975c68c2f5a778470a1cd3f003c289eda79126"} Apr 23 16:37:47.492145 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:47.492120 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" event={"ID":"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4","Type":"ContainerStarted","Data":"a0d1a51d5db46a848631c1433bebefebefa4374ff1c8bc679b5f7b5439d17063"} Apr 23 16:37:47.492145 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:47.492147 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" event={"ID":"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4","Type":"ContainerStarted","Data":"cd2c483f11ce2c4fd8423c7b38d0e002169ac094f440eb12b3de5d6295e93625"} Apr 23 16:37:47.492342 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:47.492221 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:37:47.493166 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:47.493142 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cdjpw" event={"ID":"0f0e9f3d-ecd0-4e57-8ef1-447361404429","Type":"ContainerStarted","Data":"6759f73bdb1e87ae98173ece3089ead97f198e0c2caa01d4ea0cc8888c0fae0c"} Apr 23 16:37:47.524906 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:47.524849 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" podStartSLOduration=161.524836398 podStartE2EDuration="2m41.524836398s" podCreationTimestamp="2026-04-23 16:35:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:37:47.524395033 +0000 UTC m=+162.077829203" watchObservedRunningTime="2026-04-23 16:37:47.524836398 +0000 UTC m=+162.078270605" Apr 23 16:37:49.500163 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:49.500096 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-49hdt" event={"ID":"c2a76a12-982a-438c-837c-0e7665a6f46c","Type":"ContainerStarted","Data":"356a3b1e5e82840ae8a02352d861f69c77f07854338e1fab949fd2f143d802f0"} Apr 23 16:37:49.501521 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:49.501495 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cdjpw" event={"ID":"0f0e9f3d-ecd0-4e57-8ef1-447361404429","Type":"ContainerStarted","Data":"868bc7ba284247635a8f4297ca4a3b526b745662893c42ff423712e29f94ddbd"} Apr 23 16:37:49.501619 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:49.501527 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cdjpw" event={"ID":"0f0e9f3d-ecd0-4e57-8ef1-447361404429","Type":"ContainerStarted","Data":"1450b029f7666ea861073391123baf182441f1d3459aef35d6b761134b787ed7"} Apr 23 16:37:49.501619 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:49.501610 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-cdjpw" Apr 23 16:37:49.517737 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:49.517699 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-49hdt" podStartSLOduration=129.496366085 podStartE2EDuration="2m11.517687498s" podCreationTimestamp="2026-04-23 16:35:38 +0000 UTC" firstStartedPulling="2026-04-23 16:37:47.178597066 +0000 UTC m=+161.732031214" lastFinishedPulling="2026-04-23 16:37:49.19991848 +0000 UTC m=+163.753352627" observedRunningTime="2026-04-23 16:37:49.51694951 +0000 UTC m=+164.070383682" watchObservedRunningTime="2026-04-23 16:37:49.517687498 +0000 UTC m=+164.071121667" Apr 23 16:37:49.543493 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:49.543456 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cdjpw" podStartSLOduration=129.505964163 podStartE2EDuration="2m11.543444239s" podCreationTimestamp="2026-04-23 16:35:38 +0000 UTC" firstStartedPulling="2026-04-23 16:37:47.159560246 +0000 UTC m=+161.712994394" lastFinishedPulling="2026-04-23 16:37:49.197040322 +0000 UTC m=+163.750474470" observedRunningTime="2026-04-23 16:37:49.543305154 +0000 UTC m=+164.096739327" watchObservedRunningTime="2026-04-23 16:37:49.543444239 +0000 UTC m=+164.096878408" Apr 23 16:37:50.208433 ip-10-0-141-189 kubenswrapper[2561]: I0423 
16:37:50.208344 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" podUID="304e7398-03f0-41a8-8a0f-43ae08b2760a" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.9:8000/readyz\": dial tcp 10.132.0.9:8000: connect: connection refused" Apr 23 16:37:50.505815 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:50.505738 2561 generic.go:358] "Generic (PLEG): container finished" podID="304e7398-03f0-41a8-8a0f-43ae08b2760a" containerID="02757d28b6188618161824c4a5f00a155bea325ac5aa579244df974890fd3339" exitCode=1 Apr 23 16:37:50.506230 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:50.505815 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" event={"ID":"304e7398-03f0-41a8-8a0f-43ae08b2760a","Type":"ContainerDied","Data":"02757d28b6188618161824c4a5f00a155bea325ac5aa579244df974890fd3339"} Apr 23 16:37:50.506230 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:50.506209 2561 scope.go:117] "RemoveContainer" containerID="02757d28b6188618161824c4a5f00a155bea325ac5aa579244df974890fd3339" Apr 23 16:37:50.507105 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:50.507087 2561 generic.go:358] "Generic (PLEG): container finished" podID="704a1757-9f37-46f2-bf18-a099b1493b45" containerID="82eadb08b1aaf2f128caf5a3957a059dcbd11ff2e37cd191b30dd76bbe73235b" exitCode=255 Apr 23 16:37:50.507210 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:50.507156 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-785997c65f-bzbg8" event={"ID":"704a1757-9f37-46f2-bf18-a099b1493b45","Type":"ContainerDied","Data":"82eadb08b1aaf2f128caf5a3957a059dcbd11ff2e37cd191b30dd76bbe73235b"} Apr 23 16:37:50.507596 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:50.507577 2561 scope.go:117] "RemoveContainer" containerID="82eadb08b1aaf2f128caf5a3957a059dcbd11ff2e37cd191b30dd76bbe73235b" Apr 23 16:37:51.511546 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:51.511506 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-785997c65f-bzbg8" event={"ID":"704a1757-9f37-46f2-bf18-a099b1493b45","Type":"ContainerStarted","Data":"8d51ea9c6c12169793082c7871ab20b0ab7946458d2cf843e3b08124be83cf75"} Apr 23 16:37:51.513099 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:51.513076 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" event={"ID":"304e7398-03f0-41a8-8a0f-43ae08b2760a","Type":"ContainerStarted","Data":"1b9920861efc69bb846cf88e2083f5557cad240ec4a9450800c50818e20942ad"} Apr 23 16:37:51.513325 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:51.513312 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" Apr 23 16:37:51.513840 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:51.513827 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-544dfb5797-k44w4" Apr 23 16:37:54.028033 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:54.027996 2561 util.go:30] "No sandbox for pod can be found. 
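
The "Probe failed" entry above is a plain HTTP readiness check against the pod IP and port from the probe spec (http://10.132.0.9:8000/readyz); the connection-refused result marks the pod not ready, and the ContainerDied / RemoveContainer / ContainerStarted sequence that follows is the kubelet replacing the exited container. A minimal sketch of such a check, assuming the usual success rule of any 2xx/3xx status; this is an illustration, not the kubelet's prober code:

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // ready performs one HTTP readiness probe. A transport error (such as
    // "dial tcp 10.132.0.9:8000: connect: connection refused" above) or a
    // status outside 200-399 counts as a failed probe.
    func ready(url string) bool {
        client := &http.Client{Timeout: time.Second}
        resp, err := client.Get(url)
        if err != nil {
            return false
        }
        defer resp.Body.Close()
        return resp.StatusCode >= 200 && resp.StatusCode < 400
    }

    func main() {
        fmt.Println(ready("http://10.132.0.9:8000/readyz"))
    }
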
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:37:55.134130 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.134056 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-h6lxb"] Apr 23 16:37:55.137024 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.137004 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.140981 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.140961 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 16:37:55.140981 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.140972 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 16:37:55.142429 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.142407 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 16:37:55.142531 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.142428 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 16:37:55.142531 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.142474 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 16:37:55.142531 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.142434 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-vplks\"" Apr 23 16:37:55.142680 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.142421 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 16:37:55.247823 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.247800 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-root\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.247946 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.247828 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-node-exporter-wtmp\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.247946 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.247850 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-node-exporter-tls\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.247946 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.247902 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-metrics-client-ca\") pod 
\"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.248046 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.247973 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-node-exporter-textfile\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.248046 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.247999 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlxbq\" (UniqueName: \"kubernetes.io/projected/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-kube-api-access-vlxbq\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.248046 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.248037 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-sys\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.248132 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.248062 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-node-exporter-accelerators-collector-config\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.248132 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.248081 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.349404 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.349374 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlxbq\" (UniqueName: \"kubernetes.io/projected/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-kube-api-access-vlxbq\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.349510 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.349418 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-sys\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.349510 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.349467 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-sys\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.349578 ip-10-0-141-189 kubenswrapper[2561]: 
I0423 16:37:55.349524 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-node-exporter-accelerators-collector-config\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.349578 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.349553 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.349643 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.349588 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-root\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.349643 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.349607 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-node-exporter-wtmp\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.349643 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.349623 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-node-exporter-tls\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.349771 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.349660 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-metrics-client-ca\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.349771 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.349675 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-root\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.349771 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.349688 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-node-exporter-textfile\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.349905 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.349769 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-node-exporter-wtmp\") pod \"node-exporter-h6lxb\" (UID: 
\"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.349966 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.349952 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-node-exporter-textfile\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.350143 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.350119 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-node-exporter-accelerators-collector-config\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.350210 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.350182 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-metrics-client-ca\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.351911 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.351889 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-node-exporter-tls\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.352003 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.351918 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.359078 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.359058 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlxbq\" (UniqueName: \"kubernetes.io/projected/93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a-kube-api-access-vlxbq\") pod \"node-exporter-h6lxb\" (UID: \"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a\") " pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.446012 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.445948 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-h6lxb" Apr 23 16:37:55.454390 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:37:55.454368 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93f3717b_c6ae_4dc7_8bfc_cc35c24bf51a.slice/crio-38fabe46c2d5ec60ff56642afdebfdd4486b08629785a50948e4b86074714fc9 WatchSource:0}: Error finding container 38fabe46c2d5ec60ff56642afdebfdd4486b08629785a50948e4b86074714fc9: Status 404 returned error can't find the container with id 38fabe46c2d5ec60ff56642afdebfdd4486b08629785a50948e4b86074714fc9 Apr 23 16:37:55.527823 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:55.527797 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h6lxb" event={"ID":"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a","Type":"ContainerStarted","Data":"38fabe46c2d5ec60ff56642afdebfdd4486b08629785a50948e4b86074714fc9"} Apr 23 16:37:56.532028 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:56.532002 2561 generic.go:358] "Generic (PLEG): container finished" podID="93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a" containerID="889604384da90dac765e952fb4b4b9c193d00b9d912f55bffa72231231d7e345" exitCode=0 Apr 23 16:37:56.532344 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:56.532055 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h6lxb" event={"ID":"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a","Type":"ContainerDied","Data":"889604384da90dac765e952fb4b4b9c193d00b9d912f55bffa72231231d7e345"} Apr 23 16:37:57.536252 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:57.536218 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h6lxb" event={"ID":"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a","Type":"ContainerStarted","Data":"7afdfe6e49dc5ba430f90ab2a6a0e5961b39d57d4a55623004932ef9e0f38f41"} Apr 23 16:37:57.536252 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:57.536253 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h6lxb" event={"ID":"93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a","Type":"ContainerStarted","Data":"eaf7914c2dfa80ccdb9f7999b22824bc3c9d64031a5679576b4932971ad19209"} Apr 23 16:37:57.559449 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:57.559402 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-h6lxb" podStartSLOduration=1.838471927 podStartE2EDuration="2.559389862s" podCreationTimestamp="2026-04-23 16:37:55 +0000 UTC" firstStartedPulling="2026-04-23 16:37:55.456433636 +0000 UTC m=+170.009867786" lastFinishedPulling="2026-04-23 16:37:56.17735157 +0000 UTC m=+170.730785721" observedRunningTime="2026-04-23 16:37:57.55858664 +0000 UTC m=+172.112020810" watchObservedRunningTime="2026-04-23 16:37:57.559389862 +0000 UTC m=+172.112824031" Apr 23 16:37:59.509401 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:37:59.509371 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cdjpw" Apr 23 16:38:03.091727 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:03.091698 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6f59db74c9-f44nw"] Apr 23 16:38:03.095695 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:03.095668 2561 patch_prober.go:28] interesting pod/image-registry-6f59db74c9-f44nw container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with 
statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 16:38:03.095824 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:03.095723 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" podUID="9effa94c-05cd-4a19-9dac-bcfc8c8f18f4" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:38:13.096262 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:13.096233 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:38:28.110212 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.110153 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" podUID="9effa94c-05cd-4a19-9dac-bcfc8c8f18f4" containerName="registry" containerID="cri-o://a0d1a51d5db46a848631c1433bebefebefa4374ff1c8bc679b5f7b5439d17063" gracePeriod=30 Apr 23 16:38:28.342642 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.342617 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:38:28.477061 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.476974 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-bound-sa-token\") pod \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " Apr 23 16:38:28.477061 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.477037 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-image-registry-private-configuration\") pod \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " Apr 23 16:38:28.477283 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.477068 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-trusted-ca\") pod \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " Apr 23 16:38:28.477283 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.477091 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-installation-pull-secrets\") pod \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " Apr 23 16:38:28.477283 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.477120 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls\") pod \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " Apr 23 16:38:28.477283 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.477148 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-ca-trust-extracted\") pod 
\"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " Apr 23 16:38:28.477283 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.477178 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-certificates\") pod \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " Apr 23 16:38:28.477283 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.477208 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n5kn\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-kube-api-access-6n5kn\") pod \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\" (UID: \"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4\") " Apr 23 16:38:28.477799 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.477711 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4" (UID: "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:28.478428 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.478395 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4" (UID: "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:28.479623 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.479591 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4" (UID: "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:38:28.479854 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.479824 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-kube-api-access-6n5kn" (OuterVolumeSpecName: "kube-api-access-6n5kn") pod "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4" (UID: "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4"). InnerVolumeSpecName "kube-api-access-6n5kn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:38:28.479854 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.479839 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4" (UID: "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:38:28.480018 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.479828 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4" (UID: "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:38:28.480018 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.479914 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4" (UID: "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:38:28.485806 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.485778 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4" (UID: "9effa94c-05cd-4a19-9dac-bcfc8c8f18f4"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:38:28.577942 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.577915 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6n5kn\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-kube-api-access-6n5kn\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 16:38:28.577942 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.577941 2561 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-bound-sa-token\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 16:38:28.578085 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.577957 2561 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-image-registry-private-configuration\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 16:38:28.578085 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.577970 2561 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-trusted-ca\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 16:38:28.578085 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.577982 2561 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-installation-pull-secrets\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 16:38:28.578085 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.577995 2561 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-tls\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 16:38:28.578085 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.578007 2561 reconciler_common.go:299] "Volume detached for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-ca-trust-extracted\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 16:38:28.578085 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.578020 2561 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4-registry-certificates\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 16:38:28.611689 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.611663 2561 generic.go:358] "Generic (PLEG): container finished" podID="9effa94c-05cd-4a19-9dac-bcfc8c8f18f4" containerID="a0d1a51d5db46a848631c1433bebefebefa4374ff1c8bc679b5f7b5439d17063" exitCode=0 Apr 23 16:38:28.611830 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.611721 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" Apr 23 16:38:28.611830 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.611753 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" event={"ID":"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4","Type":"ContainerDied","Data":"a0d1a51d5db46a848631c1433bebefebefa4374ff1c8bc679b5f7b5439d17063"} Apr 23 16:38:28.611830 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.611791 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f59db74c9-f44nw" event={"ID":"9effa94c-05cd-4a19-9dac-bcfc8c8f18f4","Type":"ContainerDied","Data":"cd2c483f11ce2c4fd8423c7b38d0e002169ac094f440eb12b3de5d6295e93625"} Apr 23 16:38:28.611830 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.611808 2561 scope.go:117] "RemoveContainer" containerID="a0d1a51d5db46a848631c1433bebefebefa4374ff1c8bc679b5f7b5439d17063" Apr 23 16:38:28.619838 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.619823 2561 scope.go:117] "RemoveContainer" containerID="a0d1a51d5db46a848631c1433bebefebefa4374ff1c8bc679b5f7b5439d17063" Apr 23 16:38:28.620094 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:38:28.620075 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0d1a51d5db46a848631c1433bebefebefa4374ff1c8bc679b5f7b5439d17063\": container with ID starting with a0d1a51d5db46a848631c1433bebefebefa4374ff1c8bc679b5f7b5439d17063 not found: ID does not exist" containerID="a0d1a51d5db46a848631c1433bebefebefa4374ff1c8bc679b5f7b5439d17063" Apr 23 16:38:28.620167 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.620104 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0d1a51d5db46a848631c1433bebefebefa4374ff1c8bc679b5f7b5439d17063"} err="failed to get container status \"a0d1a51d5db46a848631c1433bebefebefa4374ff1c8bc679b5f7b5439d17063\": rpc error: code = NotFound desc = could not find container \"a0d1a51d5db46a848631c1433bebefebefa4374ff1c8bc679b5f7b5439d17063\": container with ID starting with a0d1a51d5db46a848631c1433bebefebefa4374ff1c8bc679b5f7b5439d17063 not found: ID does not exist" Apr 23 16:38:28.632631 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.632607 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6f59db74c9-f44nw"] Apr 23 16:38:28.636544 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:28.636524 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-image-registry/image-registry-6f59db74c9-f44nw"] Apr 23 16:38:30.031796 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:30.031765 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9effa94c-05cd-4a19-9dac-bcfc8c8f18f4" path="/var/lib/kubelet/pods/9effa94c-05cd-4a19-9dac-bcfc8c8f18f4/volumes" Apr 23 16:38:39.451142 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:39.451102 2561 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" podUID="04f18820-a137-4fa2-804e-b8a6a7bf9eb4" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 16:38:49.451297 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:49.451260 2561 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" podUID="04f18820-a137-4fa2-804e-b8a6a7bf9eb4" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 16:38:59.450714 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:59.450673 2561 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" podUID="04f18820-a137-4fa2-804e-b8a6a7bf9eb4" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 16:38:59.451200 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:59.450739 2561 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" Apr 23 16:38:59.451200 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:59.451187 2561 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"ee852c73d67398dbda2e71bb0f47f9202c21896b1b760221b9ddf95374d3448b"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 23 16:38:59.451271 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:59.451226 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" podUID="04f18820-a137-4fa2-804e-b8a6a7bf9eb4" containerName="service-proxy" containerID="cri-o://ee852c73d67398dbda2e71bb0f47f9202c21896b1b760221b9ddf95374d3448b" gracePeriod=30 Apr 23 16:38:59.689965 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:59.689938 2561 generic.go:358] "Generic (PLEG): container finished" podID="04f18820-a137-4fa2-804e-b8a6a7bf9eb4" containerID="ee852c73d67398dbda2e71bb0f47f9202c21896b1b760221b9ddf95374d3448b" exitCode=2 Apr 23 16:38:59.690066 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:59.689996 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" event={"ID":"04f18820-a137-4fa2-804e-b8a6a7bf9eb4","Type":"ContainerDied","Data":"ee852c73d67398dbda2e71bb0f47f9202c21896b1b760221b9ddf95374d3448b"} Apr 23 16:38:59.690066 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:38:59.690028 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7749dbf8b5-zr4nh" 
event={"ID":"04f18820-a137-4fa2-804e-b8a6a7bf9eb4","Type":"ContainerStarted","Data":"dec8f0057caf0d137b2758b5c073c813ff46719a7bf2e656348be7b61eb4032e"} Apr 23 16:39:17.925992 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:39:17.925952 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs\") pod \"network-metrics-daemon-rg2cr\" (UID: \"5c564efe-4a26-4498-9f97-d71703d0aa18\") " pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:39:17.928179 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:39:17.928154 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c564efe-4a26-4498-9f97-d71703d0aa18-metrics-certs\") pod \"network-metrics-daemon-rg2cr\" (UID: \"5c564efe-4a26-4498-9f97-d71703d0aa18\") " pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:39:18.031567 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:39:18.031542 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f89h4\"" Apr 23 16:39:18.039586 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:39:18.039569 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2cr" Apr 23 16:39:18.149971 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:39:18.149815 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rg2cr"] Apr 23 16:39:18.152282 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:39:18.152252 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c564efe_4a26_4498_9f97_d71703d0aa18.slice/crio-f5810438bb62a15eb4e872d3df3d1b15d98b43139126ae3f8416f21b89a2bdd5 WatchSource:0}: Error finding container f5810438bb62a15eb4e872d3df3d1b15d98b43139126ae3f8416f21b89a2bdd5: Status 404 returned error can't find the container with id f5810438bb62a15eb4e872d3df3d1b15d98b43139126ae3f8416f21b89a2bdd5 Apr 23 16:39:18.736633 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:39:18.736595 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rg2cr" event={"ID":"5c564efe-4a26-4498-9f97-d71703d0aa18","Type":"ContainerStarted","Data":"f5810438bb62a15eb4e872d3df3d1b15d98b43139126ae3f8416f21b89a2bdd5"} Apr 23 16:39:19.742749 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:39:19.742683 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rg2cr" event={"ID":"5c564efe-4a26-4498-9f97-d71703d0aa18","Type":"ContainerStarted","Data":"f2dea7a6ab2c09d8cd41ac66aef6679d84b9507368c4312e07208203b96fb305"} Apr 23 16:39:19.742749 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:39:19.742719 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rg2cr" event={"ID":"5c564efe-4a26-4498-9f97-d71703d0aa18","Type":"ContainerStarted","Data":"5ef5a383c9161f79edffb75f75319b9f37d9e2d9cde70c7e745b8bb51f68685d"} Apr 23 16:39:19.763952 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:39:19.763905 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rg2cr" podStartSLOduration=252.436279295 podStartE2EDuration="4m13.763891189s" podCreationTimestamp="2026-04-23 16:35:06 +0000 UTC" firstStartedPulling="2026-04-23 16:39:18.154027994 +0000 
UTC m=+252.707462142" lastFinishedPulling="2026-04-23 16:39:19.481639889 +0000 UTC m=+254.035074036" observedRunningTime="2026-04-23 16:39:19.762498149 +0000 UTC m=+254.315932318" watchObservedRunningTime="2026-04-23 16:39:19.763891189 +0000 UTC m=+254.317325348" Apr 23 16:40:05.922470 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:40:05.922446 2561 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 16:42:08.530414 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:08.530384 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-j6mc8"] Apr 23 16:42:08.530844 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:08.530637 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9effa94c-05cd-4a19-9dac-bcfc8c8f18f4" containerName="registry" Apr 23 16:42:08.530844 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:08.530648 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="9effa94c-05cd-4a19-9dac-bcfc8c8f18f4" containerName="registry" Apr 23 16:42:08.530844 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:08.530689 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="9effa94c-05cd-4a19-9dac-bcfc8c8f18f4" containerName="registry" Apr 23 16:42:08.533427 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:08.533407 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-j6mc8" Apr 23 16:42:08.536987 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:08.536965 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 23 16:42:08.537265 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:08.537249 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-6zjm4\"" Apr 23 16:42:08.538142 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:08.538122 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 16:42:08.538248 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:08.538122 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 16:42:08.545592 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:08.545571 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-j6mc8"] Apr 23 16:42:08.632222 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:08.632193 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnfls\" (UniqueName: \"kubernetes.io/projected/5989610c-a62c-4e74-b9da-fe3201ddb606-kube-api-access-xnfls\") pod \"kserve-controller-manager-5b898d7b9d-j6mc8\" (UID: \"5989610c-a62c-4e74-b9da-fe3201ddb606\") " pod="kserve/kserve-controller-manager-5b898d7b9d-j6mc8" Apr 23 16:42:08.632222 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:08.632224 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5989610c-a62c-4e74-b9da-fe3201ddb606-cert\") pod \"kserve-controller-manager-5b898d7b9d-j6mc8\" (UID: \"5989610c-a62c-4e74-b9da-fe3201ddb606\") " pod="kserve/kserve-controller-manager-5b898d7b9d-j6mc8" Apr 23 16:42:08.732915 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:08.732871 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xnfls\" (UniqueName: \"kubernetes.io/projected/5989610c-a62c-4e74-b9da-fe3201ddb606-kube-api-access-xnfls\") pod \"kserve-controller-manager-5b898d7b9d-j6mc8\" (UID: \"5989610c-a62c-4e74-b9da-fe3201ddb606\") " pod="kserve/kserve-controller-manager-5b898d7b9d-j6mc8" Apr 23 16:42:08.733106 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:08.732925 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5989610c-a62c-4e74-b9da-fe3201ddb606-cert\") pod \"kserve-controller-manager-5b898d7b9d-j6mc8\" (UID: \"5989610c-a62c-4e74-b9da-fe3201ddb606\") " pod="kserve/kserve-controller-manager-5b898d7b9d-j6mc8" Apr 23 16:42:08.733106 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:42:08.733053 2561 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 23 16:42:08.733220 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:42:08.733129 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5989610c-a62c-4e74-b9da-fe3201ddb606-cert podName:5989610c-a62c-4e74-b9da-fe3201ddb606 nodeName:}" failed. No retries permitted until 2026-04-23 16:42:09.233106641 +0000 UTC m=+423.786540791 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5989610c-a62c-4e74-b9da-fe3201ddb606-cert") pod "kserve-controller-manager-5b898d7b9d-j6mc8" (UID: "5989610c-a62c-4e74-b9da-fe3201ddb606") : secret "kserve-webhook-server-cert" not found Apr 23 16:42:08.749296 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:08.749272 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnfls\" (UniqueName: \"kubernetes.io/projected/5989610c-a62c-4e74-b9da-fe3201ddb606-kube-api-access-xnfls\") pod \"kserve-controller-manager-5b898d7b9d-j6mc8\" (UID: \"5989610c-a62c-4e74-b9da-fe3201ddb606\") " pod="kserve/kserve-controller-manager-5b898d7b9d-j6mc8" Apr 23 16:42:09.236180 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:09.236135 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5989610c-a62c-4e74-b9da-fe3201ddb606-cert\") pod \"kserve-controller-manager-5b898d7b9d-j6mc8\" (UID: \"5989610c-a62c-4e74-b9da-fe3201ddb606\") " pod="kserve/kserve-controller-manager-5b898d7b9d-j6mc8" Apr 23 16:42:09.238447 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:09.238416 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5989610c-a62c-4e74-b9da-fe3201ddb606-cert\") pod \"kserve-controller-manager-5b898d7b9d-j6mc8\" (UID: \"5989610c-a62c-4e74-b9da-fe3201ddb606\") " pod="kserve/kserve-controller-manager-5b898d7b9d-j6mc8" Apr 23 16:42:09.443424 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:09.443386 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-j6mc8" Apr 23 16:42:09.560723 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:09.560692 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-j6mc8"] Apr 23 16:42:09.563406 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:42:09.563381 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5989610c_a62c_4e74_b9da_fe3201ddb606.slice/crio-995cfe5bfd10e2218b3088ab2c0446408563d820a3441ccbacfb9e994e9c7246 WatchSource:0}: Error finding container 995cfe5bfd10e2218b3088ab2c0446408563d820a3441ccbacfb9e994e9c7246: Status 404 returned error can't find the container with id 995cfe5bfd10e2218b3088ab2c0446408563d820a3441ccbacfb9e994e9c7246 Apr 23 16:42:09.564630 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:09.564614 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:42:10.162278 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:10.162236 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-j6mc8" event={"ID":"5989610c-a62c-4e74-b9da-fe3201ddb606","Type":"ContainerStarted","Data":"995cfe5bfd10e2218b3088ab2c0446408563d820a3441ccbacfb9e994e9c7246"} Apr 23 16:42:12.168947 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:12.168848 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-j6mc8" event={"ID":"5989610c-a62c-4e74-b9da-fe3201ddb606","Type":"ContainerStarted","Data":"a9726cbdba5d639d4a0271026727f552f7581cd0fa770259336d08413b2311e0"} Apr 23 16:42:12.169286 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:12.169005 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-5b898d7b9d-j6mc8" Apr 23 16:42:43.176668 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:43.176638 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-5b898d7b9d-j6mc8" Apr 23 16:42:43.195580 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:43.195536 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-5b898d7b9d-j6mc8" podStartSLOduration=32.885027836 podStartE2EDuration="35.195523925s" podCreationTimestamp="2026-04-23 16:42:08 +0000 UTC" firstStartedPulling="2026-04-23 16:42:09.564730457 +0000 UTC m=+424.118164608" lastFinishedPulling="2026-04-23 16:42:11.875226548 +0000 UTC m=+426.428660697" observedRunningTime="2026-04-23 16:42:12.19911564 +0000 UTC m=+426.752549809" watchObservedRunningTime="2026-04-23 16:42:43.195523925 +0000 UTC m=+457.748958096" Apr 23 16:42:45.538570 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:45.538537 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-j6mc8"] Apr 23 16:42:45.538938 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:45.538733 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-5b898d7b9d-j6mc8" podUID="5989610c-a62c-4e74-b9da-fe3201ddb606" containerName="manager" containerID="cri-o://a9726cbdba5d639d4a0271026727f552f7581cd0fa770259336d08413b2311e0" gracePeriod=10 Apr 23 16:42:45.572228 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:45.572201 2561 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve/kserve-controller-manager-5b898d7b9d-l9fp8"] Apr 23 16:42:45.577282 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:45.577259 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-l9fp8" Apr 23 16:42:45.586206 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:45.586184 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-l9fp8"] Apr 23 16:42:45.666942 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:45.666913 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d241cedd-49d9-4bbc-88a1-81beb91b298d-cert\") pod \"kserve-controller-manager-5b898d7b9d-l9fp8\" (UID: \"d241cedd-49d9-4bbc-88a1-81beb91b298d\") " pod="kserve/kserve-controller-manager-5b898d7b9d-l9fp8" Apr 23 16:42:45.667035 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:45.666971 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdp8l\" (UniqueName: \"kubernetes.io/projected/d241cedd-49d9-4bbc-88a1-81beb91b298d-kube-api-access-cdp8l\") pod \"kserve-controller-manager-5b898d7b9d-l9fp8\" (UID: \"d241cedd-49d9-4bbc-88a1-81beb91b298d\") " pod="kserve/kserve-controller-manager-5b898d7b9d-l9fp8" Apr 23 16:42:45.767207 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:45.767187 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-j6mc8" Apr 23 16:42:45.767331 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:45.767288 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdp8l\" (UniqueName: \"kubernetes.io/projected/d241cedd-49d9-4bbc-88a1-81beb91b298d-kube-api-access-cdp8l\") pod \"kserve-controller-manager-5b898d7b9d-l9fp8\" (UID: \"d241cedd-49d9-4bbc-88a1-81beb91b298d\") " pod="kserve/kserve-controller-manager-5b898d7b9d-l9fp8" Apr 23 16:42:45.767395 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:45.767339 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d241cedd-49d9-4bbc-88a1-81beb91b298d-cert\") pod \"kserve-controller-manager-5b898d7b9d-l9fp8\" (UID: \"d241cedd-49d9-4bbc-88a1-81beb91b298d\") " pod="kserve/kserve-controller-manager-5b898d7b9d-l9fp8" Apr 23 16:42:45.769488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:45.769471 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d241cedd-49d9-4bbc-88a1-81beb91b298d-cert\") pod \"kserve-controller-manager-5b898d7b9d-l9fp8\" (UID: \"d241cedd-49d9-4bbc-88a1-81beb91b298d\") " pod="kserve/kserve-controller-manager-5b898d7b9d-l9fp8" Apr 23 16:42:45.775453 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:45.775434 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdp8l\" (UniqueName: \"kubernetes.io/projected/d241cedd-49d9-4bbc-88a1-81beb91b298d-kube-api-access-cdp8l\") pod \"kserve-controller-manager-5b898d7b9d-l9fp8\" (UID: \"d241cedd-49d9-4bbc-88a1-81beb91b298d\") " pod="kserve/kserve-controller-manager-5b898d7b9d-l9fp8" Apr 23 16:42:45.868271 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:45.868207 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5989610c-a62c-4e74-b9da-fe3201ddb606-cert\") pod 
\"5989610c-a62c-4e74-b9da-fe3201ddb606\" (UID: \"5989610c-a62c-4e74-b9da-fe3201ddb606\") " Apr 23 16:42:45.868271 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:45.868251 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnfls\" (UniqueName: \"kubernetes.io/projected/5989610c-a62c-4e74-b9da-fe3201ddb606-kube-api-access-xnfls\") pod \"5989610c-a62c-4e74-b9da-fe3201ddb606\" (UID: \"5989610c-a62c-4e74-b9da-fe3201ddb606\") " Apr 23 16:42:45.870160 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:45.870139 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5989610c-a62c-4e74-b9da-fe3201ddb606-cert" (OuterVolumeSpecName: "cert") pod "5989610c-a62c-4e74-b9da-fe3201ddb606" (UID: "5989610c-a62c-4e74-b9da-fe3201ddb606"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:42:45.870262 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:45.870184 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5989610c-a62c-4e74-b9da-fe3201ddb606-kube-api-access-xnfls" (OuterVolumeSpecName: "kube-api-access-xnfls") pod "5989610c-a62c-4e74-b9da-fe3201ddb606" (UID: "5989610c-a62c-4e74-b9da-fe3201ddb606"). InnerVolumeSpecName "kube-api-access-xnfls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:42:45.909634 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:45.909614 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-l9fp8" Apr 23 16:42:45.969077 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:45.969051 2561 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5989610c-a62c-4e74-b9da-fe3201ddb606-cert\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 16:42:45.969077 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:45.969077 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xnfls\" (UniqueName: \"kubernetes.io/projected/5989610c-a62c-4e74-b9da-fe3201ddb606-kube-api-access-xnfls\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 16:42:46.020138 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:46.020105 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-l9fp8"] Apr 23 16:42:46.022688 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:42:46.022665 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd241cedd_49d9_4bbc_88a1_81beb91b298d.slice/crio-f0ed72314831e8c402c9dfb33049920993811d65f4a7f530c648dd2681fe154e WatchSource:0}: Error finding container f0ed72314831e8c402c9dfb33049920993811d65f4a7f530c648dd2681fe154e: Status 404 returned error can't find the container with id f0ed72314831e8c402c9dfb33049920993811d65f4a7f530c648dd2681fe154e Apr 23 16:42:46.253052 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:46.252976 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-l9fp8" event={"ID":"d241cedd-49d9-4bbc-88a1-81beb91b298d","Type":"ContainerStarted","Data":"f0ed72314831e8c402c9dfb33049920993811d65f4a7f530c648dd2681fe154e"} Apr 23 16:42:46.253860 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:46.253833 2561 generic.go:358] "Generic (PLEG): container finished" podID="5989610c-a62c-4e74-b9da-fe3201ddb606" 
containerID="a9726cbdba5d639d4a0271026727f552f7581cd0fa770259336d08413b2311e0" exitCode=0 Apr 23 16:42:46.253995 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:46.253914 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-j6mc8" event={"ID":"5989610c-a62c-4e74-b9da-fe3201ddb606","Type":"ContainerDied","Data":"a9726cbdba5d639d4a0271026727f552f7581cd0fa770259336d08413b2311e0"} Apr 23 16:42:46.253995 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:46.253927 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-5b898d7b9d-j6mc8" Apr 23 16:42:46.253995 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:46.253942 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-j6mc8" event={"ID":"5989610c-a62c-4e74-b9da-fe3201ddb606","Type":"ContainerDied","Data":"995cfe5bfd10e2218b3088ab2c0446408563d820a3441ccbacfb9e994e9c7246"} Apr 23 16:42:46.253995 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:46.253957 2561 scope.go:117] "RemoveContainer" containerID="a9726cbdba5d639d4a0271026727f552f7581cd0fa770259336d08413b2311e0" Apr 23 16:42:46.261163 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:46.261145 2561 scope.go:117] "RemoveContainer" containerID="a9726cbdba5d639d4a0271026727f552f7581cd0fa770259336d08413b2311e0" Apr 23 16:42:46.261449 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:42:46.261421 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9726cbdba5d639d4a0271026727f552f7581cd0fa770259336d08413b2311e0\": container with ID starting with a9726cbdba5d639d4a0271026727f552f7581cd0fa770259336d08413b2311e0 not found: ID does not exist" containerID="a9726cbdba5d639d4a0271026727f552f7581cd0fa770259336d08413b2311e0" Apr 23 16:42:46.261539 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:46.261453 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9726cbdba5d639d4a0271026727f552f7581cd0fa770259336d08413b2311e0"} err="failed to get container status \"a9726cbdba5d639d4a0271026727f552f7581cd0fa770259336d08413b2311e0\": rpc error: code = NotFound desc = could not find container \"a9726cbdba5d639d4a0271026727f552f7581cd0fa770259336d08413b2311e0\": container with ID starting with a9726cbdba5d639d4a0271026727f552f7581cd0fa770259336d08413b2311e0 not found: ID does not exist" Apr 23 16:42:46.273729 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:46.273706 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-j6mc8"] Apr 23 16:42:46.277277 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:46.277256 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-5b898d7b9d-j6mc8"] Apr 23 16:42:47.257912 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:47.257852 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-5b898d7b9d-l9fp8" event={"ID":"d241cedd-49d9-4bbc-88a1-81beb91b298d","Type":"ContainerStarted","Data":"60e1db69fb1efa4d6336b27257b86d4e060a37b225b9aa2bf34b47d764a7ba1c"} Apr 23 16:42:47.258345 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:47.258064 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-5b898d7b9d-l9fp8" Apr 23 16:42:47.279599 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:47.279553 2561 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-5b898d7b9d-l9fp8" podStartSLOduration=1.903325298 podStartE2EDuration="2.279541858s" podCreationTimestamp="2026-04-23 16:42:45 +0000 UTC" firstStartedPulling="2026-04-23 16:42:46.023932835 +0000 UTC m=+460.577366982" lastFinishedPulling="2026-04-23 16:42:46.400149382 +0000 UTC m=+460.953583542" observedRunningTime="2026-04-23 16:42:47.277642738 +0000 UTC m=+461.831076898" watchObservedRunningTime="2026-04-23 16:42:47.279541858 +0000 UTC m=+461.832976027" Apr 23 16:42:48.031337 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:42:48.031303 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5989610c-a62c-4e74-b9da-fe3201ddb606" path="/var/lib/kubelet/pods/5989610c-a62c-4e74-b9da-fe3201ddb606/volumes" Apr 23 16:43:18.266367 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:18.266336 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-5b898d7b9d-l9fp8" Apr 23 16:43:25.009768 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:25.009733 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-mflqr"] Apr 23 16:43:25.010236 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:25.009968 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5989610c-a62c-4e74-b9da-fe3201ddb606" containerName="manager" Apr 23 16:43:25.010236 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:25.009979 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="5989610c-a62c-4e74-b9da-fe3201ddb606" containerName="manager" Apr 23 16:43:25.010236 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:25.010024 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="5989610c-a62c-4e74-b9da-fe3201ddb606" containerName="manager" Apr 23 16:43:25.017544 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:25.017524 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-mflqr" Apr 23 16:43:25.021233 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:25.021009 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 23 16:43:25.021233 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:25.021011 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-vh68j\"" Apr 23 16:43:25.021396 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:25.021300 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-mflqr"] Apr 23 16:43:25.124754 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:25.124725 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e8542bf7-709a-4750-9a1a-1f87ef41d2cd-tls-certs\") pod \"model-serving-api-86f7b4b499-mflqr\" (UID: \"e8542bf7-709a-4750-9a1a-1f87ef41d2cd\") " pod="kserve/model-serving-api-86f7b4b499-mflqr" Apr 23 16:43:25.124914 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:25.124766 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrc97\" (UniqueName: \"kubernetes.io/projected/e8542bf7-709a-4750-9a1a-1f87ef41d2cd-kube-api-access-jrc97\") pod \"model-serving-api-86f7b4b499-mflqr\" (UID: \"e8542bf7-709a-4750-9a1a-1f87ef41d2cd\") " pod="kserve/model-serving-api-86f7b4b499-mflqr" Apr 23 16:43:25.225528 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:25.225498 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e8542bf7-709a-4750-9a1a-1f87ef41d2cd-tls-certs\") pod \"model-serving-api-86f7b4b499-mflqr\" (UID: \"e8542bf7-709a-4750-9a1a-1f87ef41d2cd\") " pod="kserve/model-serving-api-86f7b4b499-mflqr" Apr 23 16:43:25.225676 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:25.225534 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrc97\" (UniqueName: \"kubernetes.io/projected/e8542bf7-709a-4750-9a1a-1f87ef41d2cd-kube-api-access-jrc97\") pod \"model-serving-api-86f7b4b499-mflqr\" (UID: \"e8542bf7-709a-4750-9a1a-1f87ef41d2cd\") " pod="kserve/model-serving-api-86f7b4b499-mflqr" Apr 23 16:43:25.227924 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:25.227897 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e8542bf7-709a-4750-9a1a-1f87ef41d2cd-tls-certs\") pod \"model-serving-api-86f7b4b499-mflqr\" (UID: \"e8542bf7-709a-4750-9a1a-1f87ef41d2cd\") " pod="kserve/model-serving-api-86f7b4b499-mflqr" Apr 23 16:43:25.234024 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:25.234002 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrc97\" (UniqueName: \"kubernetes.io/projected/e8542bf7-709a-4750-9a1a-1f87ef41d2cd-kube-api-access-jrc97\") pod \"model-serving-api-86f7b4b499-mflqr\" (UID: \"e8542bf7-709a-4750-9a1a-1f87ef41d2cd\") " pod="kserve/model-serving-api-86f7b4b499-mflqr" Apr 23 16:43:25.328559 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:25.328539 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-mflqr" Apr 23 16:43:25.468359 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:25.468332 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-mflqr"] Apr 23 16:43:25.469567 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:43:25.469539 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8542bf7_709a_4750_9a1a_1f87ef41d2cd.slice/crio-703f893683c19c4c03e4c98a41f2f191f89cb3ec78f64fb94dbdf8bf7c48a8ab WatchSource:0}: Error finding container 703f893683c19c4c03e4c98a41f2f191f89cb3ec78f64fb94dbdf8bf7c48a8ab: Status 404 returned error can't find the container with id 703f893683c19c4c03e4c98a41f2f191f89cb3ec78f64fb94dbdf8bf7c48a8ab Apr 23 16:43:26.357176 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:26.357146 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-mflqr" event={"ID":"e8542bf7-709a-4750-9a1a-1f87ef41d2cd","Type":"ContainerStarted","Data":"703f893683c19c4c03e4c98a41f2f191f89cb3ec78f64fb94dbdf8bf7c48a8ab"} Apr 23 16:43:27.362785 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:27.362748 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-mflqr" event={"ID":"e8542bf7-709a-4750-9a1a-1f87ef41d2cd","Type":"ContainerStarted","Data":"bcabc09ccd851efeb63872e88738ec7c276050efd992d2add0f20d5da917160f"} Apr 23 16:43:27.363220 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:27.362911 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-mflqr" Apr 23 16:43:27.384841 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:27.384748 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-mflqr" podStartSLOduration=1.731914 podStartE2EDuration="3.384734755s" podCreationTimestamp="2026-04-23 16:43:24 +0000 UTC" firstStartedPulling="2026-04-23 16:43:25.471572121 +0000 UTC m=+500.025006272" lastFinishedPulling="2026-04-23 16:43:27.12439288 +0000 UTC m=+501.677827027" observedRunningTime="2026-04-23 16:43:27.383033418 +0000 UTC m=+501.936467589" watchObservedRunningTime="2026-04-23 16:43:27.384734755 +0000 UTC m=+501.938168924" Apr 23 16:43:38.369676 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:43:38.369648 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-mflqr" Apr 23 16:44:00.970628 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:00.970594 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7"] Apr 23 16:44:00.973574 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:00.973555 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7" Apr 23 16:44:00.976404 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:00.976380 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-swd2s\"" Apr 23 16:44:00.982155 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:00.982139 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7" Apr 23 16:44:00.985224 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:00.985205 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7"] Apr 23 16:44:01.103324 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:01.103266 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7"] Apr 23 16:44:01.105577 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:44:01.105551 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc41d37c8_4c2f_41c7_bf52_a20aa8b68262.slice/crio-73a3f25348e302435439bff19e7d7249748d711a32320e8d20ddca53afbaad65 WatchSource:0}: Error finding container 73a3f25348e302435439bff19e7d7249748d711a32320e8d20ddca53afbaad65: Status 404 returned error can't find the container with id 73a3f25348e302435439bff19e7d7249748d711a32320e8d20ddca53afbaad65 Apr 23 16:44:01.219890 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:01.219857 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m"] Apr 23 16:44:01.224191 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:01.224138 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" Apr 23 16:44:01.231438 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:01.231394 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m"] Apr 23 16:44:01.369982 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:01.369957 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8d75948-b0cc-4c70-92db-d2d2455d34c9-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m\" (UID: \"a8d75948-b0cc-4c70-92db-d2d2455d34c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" Apr 23 16:44:01.447691 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:01.447658 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7" event={"ID":"c41d37c8-4c2f-41c7-bf52-a20aa8b68262","Type":"ContainerStarted","Data":"73a3f25348e302435439bff19e7d7249748d711a32320e8d20ddca53afbaad65"} Apr 23 16:44:01.471112 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:01.471077 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8d75948-b0cc-4c70-92db-d2d2455d34c9-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m\" (UID: \"a8d75948-b0cc-4c70-92db-d2d2455d34c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" Apr 23 16:44:01.471430 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:01.471413 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8d75948-b0cc-4c70-92db-d2d2455d34c9-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m\" (UID: \"a8d75948-b0cc-4c70-92db-d2d2455d34c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" Apr 23 
16:44:01.536491 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:01.536471 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" Apr 23 16:44:01.568410 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:01.568381 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx"] Apr 23 16:44:01.572464 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:01.572444 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" Apr 23 16:44:01.582469 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:01.582339 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx"] Apr 23 16:44:01.657453 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:01.657272 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m"] Apr 23 16:44:01.659459 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:44:01.659432 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8d75948_b0cc_4c70_92db_d2d2455d34c9.slice/crio-4412f0be5d904169764022633a1a15972490465731f81918f1fa2b689583efc9 WatchSource:0}: Error finding container 4412f0be5d904169764022633a1a15972490465731f81918f1fa2b689583efc9: Status 404 returned error can't find the container with id 4412f0be5d904169764022633a1a15972490465731f81918f1fa2b689583efc9 Apr 23 16:44:01.672705 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:01.672683 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b78cea54-3e02-4b9e-9889-6887c5804a7f-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-597b747667-ct7xx\" (UID: \"b78cea54-3e02-4b9e-9889-6887c5804a7f\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" Apr 23 16:44:01.774175 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:01.774137 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b78cea54-3e02-4b9e-9889-6887c5804a7f-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-597b747667-ct7xx\" (UID: \"b78cea54-3e02-4b9e-9889-6887c5804a7f\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" Apr 23 16:44:01.774601 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:01.774576 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b78cea54-3e02-4b9e-9889-6887c5804a7f-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-597b747667-ct7xx\" (UID: \"b78cea54-3e02-4b9e-9889-6887c5804a7f\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" Apr 23 16:44:01.884754 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:01.884284 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" Apr 23 16:44:02.065627 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:02.065586 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx"] Apr 23 16:44:02.454704 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:02.454653 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" event={"ID":"b78cea54-3e02-4b9e-9889-6887c5804a7f","Type":"ContainerStarted","Data":"374516b6da3a62d505fcb2bc12ebeebd8346dc60023b08b0ae5d37e373b8dd17"} Apr 23 16:44:02.456195 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:02.456155 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" event={"ID":"a8d75948-b0cc-4c70-92db-d2d2455d34c9","Type":"ContainerStarted","Data":"4412f0be5d904169764022633a1a15972490465731f81918f1fa2b689583efc9"} Apr 23 16:44:14.498271 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:14.498235 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7" event={"ID":"c41d37c8-4c2f-41c7-bf52-a20aa8b68262","Type":"ContainerStarted","Data":"1a1a15af192539e21054ba7e03c2b4323efc2bfdded3ab40b91cdc92aa22991f"} Apr 23 16:44:14.498686 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:14.498410 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7" Apr 23 16:44:14.499711 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:14.499627 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7" podUID="c41d37c8-4c2f-41c7-bf52-a20aa8b68262" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused" Apr 23 16:44:14.499711 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:14.499679 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" event={"ID":"b78cea54-3e02-4b9e-9889-6887c5804a7f","Type":"ContainerStarted","Data":"ece114c0e9b1f4f5f63f90304ba1e820aaf236a197a45cd2a741109f8faa1743"} Apr 23 16:44:14.501015 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:14.500995 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" event={"ID":"a8d75948-b0cc-4c70-92db-d2d2455d34c9","Type":"ContainerStarted","Data":"4db9f3c5e796a2bf75a404126261dee327edd4e53f74421a4f3a60c35b478fda"} Apr 23 16:44:14.514102 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:14.514053 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7" podStartSLOduration=1.25610846 podStartE2EDuration="14.514041522s" podCreationTimestamp="2026-04-23 16:44:00 +0000 UTC" firstStartedPulling="2026-04-23 16:44:01.107339442 +0000 UTC m=+535.660773594" lastFinishedPulling="2026-04-23 16:44:14.365272506 +0000 UTC m=+548.918706656" observedRunningTime="2026-04-23 16:44:14.512452069 +0000 UTC m=+549.065886239" watchObservedRunningTime="2026-04-23 16:44:14.514041522 +0000 UTC m=+549.067475691" Apr 23 16:44:15.503404 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:15.503363 2561 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7" podUID="c41d37c8-4c2f-41c7-bf52-a20aa8b68262" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused" Apr 23 16:44:18.511128 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:18.511095 2561 generic.go:358] "Generic (PLEG): container finished" podID="b78cea54-3e02-4b9e-9889-6887c5804a7f" containerID="ece114c0e9b1f4f5f63f90304ba1e820aaf236a197a45cd2a741109f8faa1743" exitCode=0 Apr 23 16:44:18.511537 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:18.511165 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" event={"ID":"b78cea54-3e02-4b9e-9889-6887c5804a7f","Type":"ContainerDied","Data":"ece114c0e9b1f4f5f63f90304ba1e820aaf236a197a45cd2a741109f8faa1743"} Apr 23 16:44:18.512581 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:18.512490 2561 generic.go:358] "Generic (PLEG): container finished" podID="a8d75948-b0cc-4c70-92db-d2d2455d34c9" containerID="4db9f3c5e796a2bf75a404126261dee327edd4e53f74421a4f3a60c35b478fda" exitCode=0 Apr 23 16:44:18.512581 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:18.512558 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" event={"ID":"a8d75948-b0cc-4c70-92db-d2d2455d34c9","Type":"ContainerDied","Data":"4db9f3c5e796a2bf75a404126261dee327edd4e53f74421a4f3a60c35b478fda"} Apr 23 16:44:25.504136 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:25.504050 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7" podUID="c41d37c8-4c2f-41c7-bf52-a20aa8b68262" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused" Apr 23 16:44:25.534145 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:25.534111 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" event={"ID":"b78cea54-3e02-4b9e-9889-6887c5804a7f","Type":"ContainerStarted","Data":"aa7eaf73d51b9a1279394d0cd97a75977b2257aaeb49d1b9efd7221138fd47b8"} Apr 23 16:44:25.534498 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:25.534466 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" Apr 23 16:44:25.535583 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:25.535558 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" event={"ID":"a8d75948-b0cc-4c70-92db-d2d2455d34c9","Type":"ContainerStarted","Data":"1fce7046d8b4b4513fe1f443f0595e07ed1d9a8d91457bc6db697c6dc3d112b4"} Apr 23 16:44:25.535822 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:25.535801 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" Apr 23 16:44:25.535922 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:25.535858 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" podUID="b78cea54-3e02-4b9e-9889-6887c5804a7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 16:44:25.536688 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:25.536666 2561 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" podUID="a8d75948-b0cc-4c70-92db-d2d2455d34c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 16:44:25.549911 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:25.549777 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" podStartSLOduration=1.413199917 podStartE2EDuration="24.549763441s" podCreationTimestamp="2026-04-23 16:44:01 +0000 UTC" firstStartedPulling="2026-04-23 16:44:02.072611245 +0000 UTC m=+536.626045397" lastFinishedPulling="2026-04-23 16:44:25.209174638 +0000 UTC m=+559.762608921" observedRunningTime="2026-04-23 16:44:25.54923825 +0000 UTC m=+560.102672430" watchObservedRunningTime="2026-04-23 16:44:25.549763441 +0000 UTC m=+560.103197607" Apr 23 16:44:25.566800 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:25.566758 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" podStartSLOduration=1.013713445 podStartE2EDuration="24.56674547s" podCreationTimestamp="2026-04-23 16:44:01 +0000 UTC" firstStartedPulling="2026-04-23 16:44:01.661560464 +0000 UTC m=+536.214994611" lastFinishedPulling="2026-04-23 16:44:25.214592475 +0000 UTC m=+559.768026636" observedRunningTime="2026-04-23 16:44:25.564898536 +0000 UTC m=+560.118332699" watchObservedRunningTime="2026-04-23 16:44:25.56674547 +0000 UTC m=+560.120179639" Apr 23 16:44:26.538753 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:26.538709 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" podUID="b78cea54-3e02-4b9e-9889-6887c5804a7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 16:44:26.539225 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:26.539115 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" podUID="a8d75948-b0cc-4c70-92db-d2d2455d34c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 16:44:35.503915 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:35.503848 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7" podUID="c41d37c8-4c2f-41c7-bf52-a20aa8b68262" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused" Apr 23 16:44:36.539348 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:36.539310 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" podUID="b78cea54-3e02-4b9e-9889-6887c5804a7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 16:44:36.539710 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:36.539317 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" podUID="a8d75948-b0cc-4c70-92db-d2d2455d34c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 16:44:45.503461 ip-10-0-141-189 
kubenswrapper[2561]: I0423 16:44:45.503414 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7" podUID="c41d37c8-4c2f-41c7-bf52-a20aa8b68262" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused" Apr 23 16:44:46.539651 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:46.539608 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" podUID="a8d75948-b0cc-4c70-92db-d2d2455d34c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 16:44:46.540034 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:46.539608 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" podUID="b78cea54-3e02-4b9e-9889-6887c5804a7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 16:44:55.504111 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:55.504073 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7" podUID="c41d37c8-4c2f-41c7-bf52-a20aa8b68262" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused" Apr 23 16:44:56.539321 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:56.539285 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" podUID="b78cea54-3e02-4b9e-9889-6887c5804a7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 16:44:56.539692 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:44:56.539289 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" podUID="a8d75948-b0cc-4c70-92db-d2d2455d34c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 16:45:05.505056 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:05.505021 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7" Apr 23 16:45:06.538894 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:06.538835 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" podUID="b78cea54-3e02-4b9e-9889-6887c5804a7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 16:45:06.539348 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:06.539259 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" podUID="a8d75948-b0cc-4c70-92db-d2d2455d34c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 16:45:16.539068 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:16.539023 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" podUID="b78cea54-3e02-4b9e-9889-6887c5804a7f" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.18:8080: connect: connection refused" Apr 23 16:45:16.539455 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:16.539233 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" podUID="a8d75948-b0cc-4c70-92db-d2d2455d34c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 16:45:21.070323 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:21.070280 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l"] Apr 23 16:45:21.073411 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:21.073394 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" Apr 23 16:45:21.076124 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:21.076103 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 16:45:21.076241 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:21.076103 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-5bdc4-serving-cert\"" Apr 23 16:45:21.076241 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:21.076100 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-5bdc4-kube-rbac-proxy-sar-config\"" Apr 23 16:45:21.086111 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:21.086088 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l"] Apr 23 16:45:21.250999 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:21.250973 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49efd145-4d17-4fa2-9101-2ba8c03d6df6-openshift-service-ca-bundle\") pod \"switch-graph-5bdc4-7467b6c785-7tk6l\" (UID: \"49efd145-4d17-4fa2-9101-2ba8c03d6df6\") " pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" Apr 23 16:45:21.251159 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:21.251018 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49efd145-4d17-4fa2-9101-2ba8c03d6df6-proxy-tls\") pod \"switch-graph-5bdc4-7467b6c785-7tk6l\" (UID: \"49efd145-4d17-4fa2-9101-2ba8c03d6df6\") " pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" Apr 23 16:45:21.352115 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:21.352020 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49efd145-4d17-4fa2-9101-2ba8c03d6df6-openshift-service-ca-bundle\") pod \"switch-graph-5bdc4-7467b6c785-7tk6l\" (UID: \"49efd145-4d17-4fa2-9101-2ba8c03d6df6\") " pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" Apr 23 16:45:21.352115 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:21.352082 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49efd145-4d17-4fa2-9101-2ba8c03d6df6-proxy-tls\") pod \"switch-graph-5bdc4-7467b6c785-7tk6l\" (UID: \"49efd145-4d17-4fa2-9101-2ba8c03d6df6\") " pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" Apr 23 16:45:21.352631 ip-10-0-141-189 
kubenswrapper[2561]: I0423 16:45:21.352590 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49efd145-4d17-4fa2-9101-2ba8c03d6df6-openshift-service-ca-bundle\") pod \"switch-graph-5bdc4-7467b6c785-7tk6l\" (UID: \"49efd145-4d17-4fa2-9101-2ba8c03d6df6\") " pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" Apr 23 16:45:21.354436 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:21.354413 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49efd145-4d17-4fa2-9101-2ba8c03d6df6-proxy-tls\") pod \"switch-graph-5bdc4-7467b6c785-7tk6l\" (UID: \"49efd145-4d17-4fa2-9101-2ba8c03d6df6\") " pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" Apr 23 16:45:21.383601 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:21.383579 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" Apr 23 16:45:21.497783 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:21.497754 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l"] Apr 23 16:45:21.500765 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:45:21.500739 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49efd145_4d17_4fa2_9101_2ba8c03d6df6.slice/crio-14705bbb8568384b97919e6ba73392904062ea9534527f1e3c2df7748d2cd309 WatchSource:0}: Error finding container 14705bbb8568384b97919e6ba73392904062ea9534527f1e3c2df7748d2cd309: Status 404 returned error can't find the container with id 14705bbb8568384b97919e6ba73392904062ea9534527f1e3c2df7748d2cd309 Apr 23 16:45:21.686475 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:21.686392 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" event={"ID":"49efd145-4d17-4fa2-9101-2ba8c03d6df6","Type":"ContainerStarted","Data":"14705bbb8568384b97919e6ba73392904062ea9534527f1e3c2df7748d2cd309"} Apr 23 16:45:24.697638 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:24.697562 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" event={"ID":"49efd145-4d17-4fa2-9101-2ba8c03d6df6","Type":"ContainerStarted","Data":"1e7a625fec86defaee5c01867d2c4ec0d3239c3f49d047e812a07d7cb40175c1"} Apr 23 16:45:24.697989 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:24.697726 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" Apr 23 16:45:24.716052 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:24.716007 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" podStartSLOduration=1.396860078 podStartE2EDuration="3.715993041s" podCreationTimestamp="2026-04-23 16:45:21 +0000 UTC" firstStartedPulling="2026-04-23 16:45:21.502892909 +0000 UTC m=+616.056327057" lastFinishedPulling="2026-04-23 16:45:23.822025874 +0000 UTC m=+618.375460020" observedRunningTime="2026-04-23 16:45:24.714053265 +0000 UTC m=+619.267487437" watchObservedRunningTime="2026-04-23 16:45:24.715993041 +0000 UTC m=+619.269427210" Apr 23 16:45:26.539534 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:26.539499 2561 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" podUID="a8d75948-b0cc-4c70-92db-d2d2455d34c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 16:45:26.539898 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:26.539502 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" podUID="b78cea54-3e02-4b9e-9889-6887c5804a7f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 16:45:30.707380 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:30.707351 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" Apr 23 16:45:31.232247 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:31.232216 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l"] Apr 23 16:45:31.232441 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:31.232402 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" podUID="49efd145-4d17-4fa2-9101-2ba8c03d6df6" containerName="switch-graph-5bdc4" containerID="cri-o://1e7a625fec86defaee5c01867d2c4ec0d3239c3f49d047e812a07d7cb40175c1" gracePeriod=30 Apr 23 16:45:31.372851 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:31.372817 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7"] Apr 23 16:45:31.373093 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:31.373070 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7" podUID="c41d37c8-4c2f-41c7-bf52-a20aa8b68262" containerName="kserve-container" containerID="cri-o://1a1a15af192539e21054ba7e03c2b4323efc2bfdded3ab40b91cdc92aa22991f" gracePeriod=30 Apr 23 16:45:31.400780 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:31.400756 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7"] Apr 23 16:45:31.403834 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:31.403813 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7" Apr 23 16:45:31.413068 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:31.413046 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7"] Apr 23 16:45:31.413594 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:31.413580 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7" Apr 23 16:45:31.534785 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:31.534757 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7"] Apr 23 16:45:31.537342 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:45:31.537317 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0728115_e6b3_4748_9502_0f79c2fba7b0.slice/crio-bda1d9cec0f385c71834403bded9f34ba3ac15ff710b3e9838dd9b1998858fb1 WatchSource:0}: Error finding container bda1d9cec0f385c71834403bded9f34ba3ac15ff710b3e9838dd9b1998858fb1: Status 404 returned error can't find the container with id bda1d9cec0f385c71834403bded9f34ba3ac15ff710b3e9838dd9b1998858fb1 Apr 23 16:45:31.718342 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:31.718311 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7" event={"ID":"f0728115-e6b3-4748-9502-0f79c2fba7b0","Type":"ContainerStarted","Data":"218861abf0cdf055fd5f51f8b3a9a2045fa56bf7702033fc54e2bf74101f7f62"} Apr 23 16:45:31.718729 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:31.718352 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7" event={"ID":"f0728115-e6b3-4748-9502-0f79c2fba7b0","Type":"ContainerStarted","Data":"bda1d9cec0f385c71834403bded9f34ba3ac15ff710b3e9838dd9b1998858fb1"} Apr 23 16:45:31.718729 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:31.718633 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7" Apr 23 16:45:31.719914 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:31.719861 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7" podUID="f0728115-e6b3-4748-9502-0f79c2fba7b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 23 16:45:31.735438 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:31.735393 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7" podStartSLOduration=0.735379684 podStartE2EDuration="735.379684ms" podCreationTimestamp="2026-04-23 16:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:45:31.734242801 +0000 UTC m=+626.287676969" watchObservedRunningTime="2026-04-23 16:45:31.735379684 +0000 UTC m=+626.288813853" Apr 23 16:45:32.721112 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:32.721074 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7" podUID="f0728115-e6b3-4748-9502-0f79c2fba7b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 23 16:45:34.208224 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:34.208205 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7" Apr 23 16:45:34.727445 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:34.727413 2561 generic.go:358] "Generic (PLEG): container finished" podID="c41d37c8-4c2f-41c7-bf52-a20aa8b68262" containerID="1a1a15af192539e21054ba7e03c2b4323efc2bfdded3ab40b91cdc92aa22991f" exitCode=0 Apr 23 16:45:34.727581 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:34.727471 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7" Apr 23 16:45:34.727581 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:34.727496 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7" event={"ID":"c41d37c8-4c2f-41c7-bf52-a20aa8b68262","Type":"ContainerDied","Data":"1a1a15af192539e21054ba7e03c2b4323efc2bfdded3ab40b91cdc92aa22991f"} Apr 23 16:45:34.727581 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:34.727535 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7" event={"ID":"c41d37c8-4c2f-41c7-bf52-a20aa8b68262","Type":"ContainerDied","Data":"73a3f25348e302435439bff19e7d7249748d711a32320e8d20ddca53afbaad65"} Apr 23 16:45:34.727581 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:34.727550 2561 scope.go:117] "RemoveContainer" containerID="1a1a15af192539e21054ba7e03c2b4323efc2bfdded3ab40b91cdc92aa22991f" Apr 23 16:45:34.735025 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:34.735004 2561 scope.go:117] "RemoveContainer" containerID="1a1a15af192539e21054ba7e03c2b4323efc2bfdded3ab40b91cdc92aa22991f" Apr 23 16:45:34.735339 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:45:34.735280 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a1a15af192539e21054ba7e03c2b4323efc2bfdded3ab40b91cdc92aa22991f\": container with ID starting with 1a1a15af192539e21054ba7e03c2b4323efc2bfdded3ab40b91cdc92aa22991f not found: ID does not exist" containerID="1a1a15af192539e21054ba7e03c2b4323efc2bfdded3ab40b91cdc92aa22991f" Apr 23 16:45:34.735339 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:34.735322 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a1a15af192539e21054ba7e03c2b4323efc2bfdded3ab40b91cdc92aa22991f"} err="failed to get container status \"1a1a15af192539e21054ba7e03c2b4323efc2bfdded3ab40b91cdc92aa22991f\": rpc error: code = NotFound desc = could not find container \"1a1a15af192539e21054ba7e03c2b4323efc2bfdded3ab40b91cdc92aa22991f\": container with ID starting with 1a1a15af192539e21054ba7e03c2b4323efc2bfdded3ab40b91cdc92aa22991f not found: ID does not exist" Apr 23 16:45:34.751398 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:34.748850 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7"] Apr 23 16:45:34.753359 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:34.753338 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5bdc4-predictor-d6748fc7f-6x4x7"] Apr 23 16:45:35.704967 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:35.704915 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" podUID="49efd145-4d17-4fa2-9101-2ba8c03d6df6" containerName="switch-graph-5bdc4" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 23 16:45:36.031615 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:36.031586 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c41d37c8-4c2f-41c7-bf52-a20aa8b68262" path="/var/lib/kubelet/pods/c41d37c8-4c2f-41c7-bf52-a20aa8b68262/volumes" Apr 23 16:45:36.539847 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:36.539812 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" Apr 23 16:45:36.540356 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:36.540335 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" Apr 23 16:45:40.704388 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:40.704350 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" podUID="49efd145-4d17-4fa2-9101-2ba8c03d6df6" containerName="switch-graph-5bdc4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:45:42.721866 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:42.721825 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7" podUID="f0728115-e6b3-4748-9502-0f79c2fba7b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 23 16:45:45.704798 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:45.704760 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" podUID="49efd145-4d17-4fa2-9101-2ba8c03d6df6" containerName="switch-graph-5bdc4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:45:45.705262 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:45.704899 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" Apr 23 16:45:50.704275 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:50.704229 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" podUID="49efd145-4d17-4fa2-9101-2ba8c03d6df6" containerName="switch-graph-5bdc4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:45:52.721426 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:52.721382 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7" podUID="f0728115-e6b3-4748-9502-0f79c2fba7b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 23 16:45:55.705295 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:45:55.705255 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" podUID="49efd145-4d17-4fa2-9101-2ba8c03d6df6" containerName="switch-graph-5bdc4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:46:00.704740 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:00.704697 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" podUID="49efd145-4d17-4fa2-9101-2ba8c03d6df6" containerName="switch-graph-5bdc4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:46:01.119361 ip-10-0-141-189 
kubenswrapper[2561]: I0423 16:46:01.119334 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-5768cff546-4jftj"] Apr 23 16:46:01.119658 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.119643 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c41d37c8-4c2f-41c7-bf52-a20aa8b68262" containerName="kserve-container" Apr 23 16:46:01.119767 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.119662 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41d37c8-4c2f-41c7-bf52-a20aa8b68262" containerName="kserve-container" Apr 23 16:46:01.119767 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.119737 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="c41d37c8-4c2f-41c7-bf52-a20aa8b68262" containerName="kserve-container" Apr 23 16:46:01.125643 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.125620 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" Apr 23 16:46:01.128111 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.128089 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\"" Apr 23 16:46:01.128319 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.128296 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\"" Apr 23 16:46:01.131276 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.131256 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5768cff546-4jftj"] Apr 23 16:46:01.217582 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.217553 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c581db6-1683-4dc5-a6dd-b0f330838a9d-proxy-tls\") pod \"model-chainer-5768cff546-4jftj\" (UID: \"4c581db6-1683-4dc5-a6dd-b0f330838a9d\") " pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" Apr 23 16:46:01.217704 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.217610 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c581db6-1683-4dc5-a6dd-b0f330838a9d-openshift-service-ca-bundle\") pod \"model-chainer-5768cff546-4jftj\" (UID: \"4c581db6-1683-4dc5-a6dd-b0f330838a9d\") " pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" Apr 23 16:46:01.318465 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.318438 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c581db6-1683-4dc5-a6dd-b0f330838a9d-proxy-tls\") pod \"model-chainer-5768cff546-4jftj\" (UID: \"4c581db6-1683-4dc5-a6dd-b0f330838a9d\") " pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" Apr 23 16:46:01.318600 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.318504 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c581db6-1683-4dc5-a6dd-b0f330838a9d-openshift-service-ca-bundle\") pod \"model-chainer-5768cff546-4jftj\" (UID: \"4c581db6-1683-4dc5-a6dd-b0f330838a9d\") " pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" Apr 23 16:46:01.319266 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.319240 2561 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c581db6-1683-4dc5-a6dd-b0f330838a9d-openshift-service-ca-bundle\") pod \"model-chainer-5768cff546-4jftj\" (UID: \"4c581db6-1683-4dc5-a6dd-b0f330838a9d\") " pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" Apr 23 16:46:01.320801 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.320775 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c581db6-1683-4dc5-a6dd-b0f330838a9d-proxy-tls\") pod \"model-chainer-5768cff546-4jftj\" (UID: \"4c581db6-1683-4dc5-a6dd-b0f330838a9d\") " pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" Apr 23 16:46:01.368087 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.368069 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" Apr 23 16:46:01.419382 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.419327 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49efd145-4d17-4fa2-9101-2ba8c03d6df6-proxy-tls\") pod \"49efd145-4d17-4fa2-9101-2ba8c03d6df6\" (UID: \"49efd145-4d17-4fa2-9101-2ba8c03d6df6\") " Apr 23 16:46:01.419469 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.419390 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49efd145-4d17-4fa2-9101-2ba8c03d6df6-openshift-service-ca-bundle\") pod \"49efd145-4d17-4fa2-9101-2ba8c03d6df6\" (UID: \"49efd145-4d17-4fa2-9101-2ba8c03d6df6\") " Apr 23 16:46:01.419703 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.419677 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49efd145-4d17-4fa2-9101-2ba8c03d6df6-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "49efd145-4d17-4fa2-9101-2ba8c03d6df6" (UID: "49efd145-4d17-4fa2-9101-2ba8c03d6df6"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:46:01.421371 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.421350 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49efd145-4d17-4fa2-9101-2ba8c03d6df6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "49efd145-4d17-4fa2-9101-2ba8c03d6df6" (UID: "49efd145-4d17-4fa2-9101-2ba8c03d6df6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:46:01.436504 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.436483 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" Apr 23 16:46:01.520643 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.520619 2561 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49efd145-4d17-4fa2-9101-2ba8c03d6df6-openshift-service-ca-bundle\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 16:46:01.520643 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.520641 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49efd145-4d17-4fa2-9101-2ba8c03d6df6-proxy-tls\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 16:46:01.552690 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.552590 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5768cff546-4jftj"] Apr 23 16:46:01.804041 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.804007 2561 generic.go:358] "Generic (PLEG): container finished" podID="49efd145-4d17-4fa2-9101-2ba8c03d6df6" containerID="1e7a625fec86defaee5c01867d2c4ec0d3239c3f49d047e812a07d7cb40175c1" exitCode=0 Apr 23 16:46:01.804419 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.804079 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" event={"ID":"49efd145-4d17-4fa2-9101-2ba8c03d6df6","Type":"ContainerDied","Data":"1e7a625fec86defaee5c01867d2c4ec0d3239c3f49d047e812a07d7cb40175c1"} Apr 23 16:46:01.804419 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.804119 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" event={"ID":"49efd145-4d17-4fa2-9101-2ba8c03d6df6","Type":"ContainerDied","Data":"14705bbb8568384b97919e6ba73392904062ea9534527f1e3c2df7748d2cd309"} Apr 23 16:46:01.804419 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.804142 2561 scope.go:117] "RemoveContainer" containerID="1e7a625fec86defaee5c01867d2c4ec0d3239c3f49d047e812a07d7cb40175c1" Apr 23 16:46:01.804419 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.804093 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l" Apr 23 16:46:01.805459 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.805435 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" event={"ID":"4c581db6-1683-4dc5-a6dd-b0f330838a9d","Type":"ContainerStarted","Data":"ed7254e2d8e40fa9c405f9dd191a98b8d47c345d904d669fc27a069838eb9524"} Apr 23 16:46:01.805459 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.805464 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" event={"ID":"4c581db6-1683-4dc5-a6dd-b0f330838a9d","Type":"ContainerStarted","Data":"77b686d19f5921089b44c5c4a1fd91573c995ae96949de1bb315df092601582a"} Apr 23 16:46:01.805598 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.805574 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" Apr 23 16:46:01.811741 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.811622 2561 scope.go:117] "RemoveContainer" containerID="1e7a625fec86defaee5c01867d2c4ec0d3239c3f49d047e812a07d7cb40175c1" Apr 23 16:46:01.811968 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:46:01.811931 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e7a625fec86defaee5c01867d2c4ec0d3239c3f49d047e812a07d7cb40175c1\": container with ID starting with 1e7a625fec86defaee5c01867d2c4ec0d3239c3f49d047e812a07d7cb40175c1 not found: ID does not exist" containerID="1e7a625fec86defaee5c01867d2c4ec0d3239c3f49d047e812a07d7cb40175c1" Apr 23 16:46:01.812060 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.811980 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e7a625fec86defaee5c01867d2c4ec0d3239c3f49d047e812a07d7cb40175c1"} err="failed to get container status \"1e7a625fec86defaee5c01867d2c4ec0d3239c3f49d047e812a07d7cb40175c1\": rpc error: code = NotFound desc = could not find container \"1e7a625fec86defaee5c01867d2c4ec0d3239c3f49d047e812a07d7cb40175c1\": container with ID starting with 1e7a625fec86defaee5c01867d2c4ec0d3239c3f49d047e812a07d7cb40175c1 not found: ID does not exist" Apr 23 16:46:01.824887 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.824832 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" podStartSLOduration=0.824819902 podStartE2EDuration="824.819902ms" podCreationTimestamp="2026-04-23 16:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:46:01.823785725 +0000 UTC m=+656.377219895" watchObservedRunningTime="2026-04-23 16:46:01.824819902 +0000 UTC m=+656.378254065" Apr 23 16:46:01.837514 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.837494 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l"] Apr 23 16:46:01.842795 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:01.842774 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-5bdc4-7467b6c785-7tk6l"] Apr 23 16:46:02.032185 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:02.032157 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49efd145-4d17-4fa2-9101-2ba8c03d6df6" path="/var/lib/kubelet/pods/49efd145-4d17-4fa2-9101-2ba8c03d6df6/volumes" Apr 
23 16:46:02.721995 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:02.721957 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7" podUID="f0728115-e6b3-4748-9502-0f79c2fba7b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 23 16:46:07.813976 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:07.813949 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" Apr 23 16:46:11.204559 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:11.204527 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5768cff546-4jftj"] Apr 23 16:46:11.204914 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:11.204786 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" podUID="4c581db6-1683-4dc5-a6dd-b0f330838a9d" containerName="model-chainer" containerID="cri-o://ed7254e2d8e40fa9c405f9dd191a98b8d47c345d904d669fc27a069838eb9524" gracePeriod=30 Apr 23 16:46:11.310300 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:11.310271 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx"] Apr 23 16:46:11.310550 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:11.310530 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" podUID="b78cea54-3e02-4b9e-9889-6887c5804a7f" containerName="kserve-container" containerID="cri-o://aa7eaf73d51b9a1279394d0cd97a75977b2257aaeb49d1b9efd7221138fd47b8" gracePeriod=30 Apr 23 16:46:11.326612 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:11.326586 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4"] Apr 23 16:46:11.326850 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:11.326839 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49efd145-4d17-4fa2-9101-2ba8c03d6df6" containerName="switch-graph-5bdc4" Apr 23 16:46:11.326916 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:11.326853 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="49efd145-4d17-4fa2-9101-2ba8c03d6df6" containerName="switch-graph-5bdc4" Apr 23 16:46:11.326961 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:11.326930 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="49efd145-4d17-4fa2-9101-2ba8c03d6df6" containerName="switch-graph-5bdc4" Apr 23 16:46:11.331340 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:11.331322 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4" Apr 23 16:46:11.341543 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:11.341510 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4" Apr 23 16:46:11.341543 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:11.341525 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4"] Apr 23 16:46:11.369015 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:11.368988 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m"] Apr 23 16:46:11.369365 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:11.369315 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" podUID="a8d75948-b0cc-4c70-92db-d2d2455d34c9" containerName="kserve-container" containerID="cri-o://1fce7046d8b4b4513fe1f443f0595e07ed1d9a8d91457bc6db697c6dc3d112b4" gracePeriod=30 Apr 23 16:46:11.475264 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:11.475203 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4"] Apr 23 16:46:11.478319 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:46:11.478289 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb847e371_ddda_4134_9e7c_8cd9ac33a9f5.slice/crio-bd9e5cbaa7fcdf3bfd2470fb5880c4f318b4f77b455c629a0d6589146fb831bb WatchSource:0}: Error finding container bd9e5cbaa7fcdf3bfd2470fb5880c4f318b4f77b455c629a0d6589146fb831bb: Status 404 returned error can't find the container with id bd9e5cbaa7fcdf3bfd2470fb5880c4f318b4f77b455c629a0d6589146fb831bb Apr 23 16:46:11.832096 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:11.832050 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4" event={"ID":"b847e371-ddda-4134-9e7c-8cd9ac33a9f5","Type":"ContainerStarted","Data":"3903b970763fc48cb6cb84fe97ba0eea179ca8d1f0322c4dd04f80d2e8bc725d"} Apr 23 16:46:11.832096 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:11.832101 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4" event={"ID":"b847e371-ddda-4134-9e7c-8cd9ac33a9f5","Type":"ContainerStarted","Data":"bd9e5cbaa7fcdf3bfd2470fb5880c4f318b4f77b455c629a0d6589146fb831bb"} Apr 23 16:46:11.832464 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:11.832228 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4" Apr 23 16:46:11.833460 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:11.833436 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4" podUID="b847e371-ddda-4134-9e7c-8cd9ac33a9f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 16:46:11.848525 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:11.848478 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4" podStartSLOduration=0.848464421 podStartE2EDuration="848.464421ms" podCreationTimestamp="2026-04-23 16:46:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:46:11.847942316 
+0000 UTC m=+666.401376486" watchObservedRunningTime="2026-04-23 16:46:11.848464421 +0000 UTC m=+666.401898591" Apr 23 16:46:12.721336 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:12.721291 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7" podUID="f0728115-e6b3-4748-9502-0f79c2fba7b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 23 16:46:12.812514 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:12.812483 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" podUID="4c581db6-1683-4dc5-a6dd-b0f330838a9d" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:46:12.834579 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:12.834546 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4" podUID="b847e371-ddda-4134-9e7c-8cd9ac33a9f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 16:46:16.227438 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.227416 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" Apr 23 16:46:16.313017 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.312997 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" Apr 23 16:46:16.321186 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.321170 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b78cea54-3e02-4b9e-9889-6887c5804a7f-kserve-provision-location\") pod \"b78cea54-3e02-4b9e-9889-6887c5804a7f\" (UID: \"b78cea54-3e02-4b9e-9889-6887c5804a7f\") " Apr 23 16:46:16.321432 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.321413 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b78cea54-3e02-4b9e-9889-6887c5804a7f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b78cea54-3e02-4b9e-9889-6887c5804a7f" (UID: "b78cea54-3e02-4b9e-9889-6887c5804a7f"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:46:16.421996 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.421973 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8d75948-b0cc-4c70-92db-d2d2455d34c9-kserve-provision-location\") pod \"a8d75948-b0cc-4c70-92db-d2d2455d34c9\" (UID: \"a8d75948-b0cc-4c70-92db-d2d2455d34c9\") " Apr 23 16:46:16.422117 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.422107 2561 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b78cea54-3e02-4b9e-9889-6887c5804a7f-kserve-provision-location\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 16:46:16.422262 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.422241 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8d75948-b0cc-4c70-92db-d2d2455d34c9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a8d75948-b0cc-4c70-92db-d2d2455d34c9" (UID: "a8d75948-b0cc-4c70-92db-d2d2455d34c9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:46:16.522714 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.522691 2561 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8d75948-b0cc-4c70-92db-d2d2455d34c9-kserve-provision-location\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 16:46:16.845413 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.845386 2561 generic.go:358] "Generic (PLEG): container finished" podID="b78cea54-3e02-4b9e-9889-6887c5804a7f" containerID="aa7eaf73d51b9a1279394d0cd97a75977b2257aaeb49d1b9efd7221138fd47b8" exitCode=0 Apr 23 16:46:16.845574 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.845452 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" event={"ID":"b78cea54-3e02-4b9e-9889-6887c5804a7f","Type":"ContainerDied","Data":"aa7eaf73d51b9a1279394d0cd97a75977b2257aaeb49d1b9efd7221138fd47b8"} Apr 23 16:46:16.845574 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.845458 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" Apr 23 16:46:16.845574 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.845481 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx" event={"ID":"b78cea54-3e02-4b9e-9889-6887c5804a7f","Type":"ContainerDied","Data":"374516b6da3a62d505fcb2bc12ebeebd8346dc60023b08b0ae5d37e373b8dd17"} Apr 23 16:46:16.845574 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.845502 2561 scope.go:117] "RemoveContainer" containerID="aa7eaf73d51b9a1279394d0cd97a75977b2257aaeb49d1b9efd7221138fd47b8" Apr 23 16:46:16.846830 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.846800 2561 generic.go:358] "Generic (PLEG): container finished" podID="a8d75948-b0cc-4c70-92db-d2d2455d34c9" containerID="1fce7046d8b4b4513fe1f443f0595e07ed1d9a8d91457bc6db697c6dc3d112b4" exitCode=0 Apr 23 16:46:16.846969 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.846888 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" event={"ID":"a8d75948-b0cc-4c70-92db-d2d2455d34c9","Type":"ContainerDied","Data":"1fce7046d8b4b4513fe1f443f0595e07ed1d9a8d91457bc6db697c6dc3d112b4"} Apr 23 16:46:16.846969 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.846922 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" event={"ID":"a8d75948-b0cc-4c70-92db-d2d2455d34c9","Type":"ContainerDied","Data":"4412f0be5d904169764022633a1a15972490465731f81918f1fa2b689583efc9"} Apr 23 16:46:16.846969 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.846897 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m" Apr 23 16:46:16.852747 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.852730 2561 scope.go:117] "RemoveContainer" containerID="ece114c0e9b1f4f5f63f90304ba1e820aaf236a197a45cd2a741109f8faa1743" Apr 23 16:46:16.859584 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.859569 2561 scope.go:117] "RemoveContainer" containerID="aa7eaf73d51b9a1279394d0cd97a75977b2257aaeb49d1b9efd7221138fd47b8" Apr 23 16:46:16.859807 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:46:16.859791 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa7eaf73d51b9a1279394d0cd97a75977b2257aaeb49d1b9efd7221138fd47b8\": container with ID starting with aa7eaf73d51b9a1279394d0cd97a75977b2257aaeb49d1b9efd7221138fd47b8 not found: ID does not exist" containerID="aa7eaf73d51b9a1279394d0cd97a75977b2257aaeb49d1b9efd7221138fd47b8" Apr 23 16:46:16.859849 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.859814 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa7eaf73d51b9a1279394d0cd97a75977b2257aaeb49d1b9efd7221138fd47b8"} err="failed to get container status \"aa7eaf73d51b9a1279394d0cd97a75977b2257aaeb49d1b9efd7221138fd47b8\": rpc error: code = NotFound desc = could not find container \"aa7eaf73d51b9a1279394d0cd97a75977b2257aaeb49d1b9efd7221138fd47b8\": container with ID starting with aa7eaf73d51b9a1279394d0cd97a75977b2257aaeb49d1b9efd7221138fd47b8 not found: ID does not exist" Apr 23 16:46:16.859849 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.859828 2561 scope.go:117] "RemoveContainer" containerID="ece114c0e9b1f4f5f63f90304ba1e820aaf236a197a45cd2a741109f8faa1743" Apr 23 16:46:16.860038 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:46:16.860020 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ece114c0e9b1f4f5f63f90304ba1e820aaf236a197a45cd2a741109f8faa1743\": container with ID starting with ece114c0e9b1f4f5f63f90304ba1e820aaf236a197a45cd2a741109f8faa1743 not found: ID does not exist" containerID="ece114c0e9b1f4f5f63f90304ba1e820aaf236a197a45cd2a741109f8faa1743" Apr 23 16:46:16.860075 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.860045 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ece114c0e9b1f4f5f63f90304ba1e820aaf236a197a45cd2a741109f8faa1743"} err="failed to get container status \"ece114c0e9b1f4f5f63f90304ba1e820aaf236a197a45cd2a741109f8faa1743\": rpc error: code = NotFound desc = could not find container \"ece114c0e9b1f4f5f63f90304ba1e820aaf236a197a45cd2a741109f8faa1743\": container with ID starting with ece114c0e9b1f4f5f63f90304ba1e820aaf236a197a45cd2a741109f8faa1743 not found: ID does not exist" Apr 23 16:46:16.860075 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.860061 2561 scope.go:117] "RemoveContainer" containerID="1fce7046d8b4b4513fe1f443f0595e07ed1d9a8d91457bc6db697c6dc3d112b4" Apr 23 16:46:16.865930 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.865915 2561 scope.go:117] "RemoveContainer" containerID="4db9f3c5e796a2bf75a404126261dee327edd4e53f74421a4f3a60c35b478fda" Apr 23 16:46:16.872297 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.872276 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m"] Apr 23 16:46:16.872710 ip-10-0-141-189 kubenswrapper[2561]: 
Apr 23 16:46:16.872963 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:46:16.872945 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fce7046d8b4b4513fe1f443f0595e07ed1d9a8d91457bc6db697c6dc3d112b4\": container with ID starting with 1fce7046d8b4b4513fe1f443f0595e07ed1d9a8d91457bc6db697c6dc3d112b4 not found: ID does not exist" containerID="1fce7046d8b4b4513fe1f443f0595e07ed1d9a8d91457bc6db697c6dc3d112b4"
Apr 23 16:46:16.873016 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.872969 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fce7046d8b4b4513fe1f443f0595e07ed1d9a8d91457bc6db697c6dc3d112b4"} err="failed to get container status \"1fce7046d8b4b4513fe1f443f0595e07ed1d9a8d91457bc6db697c6dc3d112b4\": rpc error: code = NotFound desc = could not find container \"1fce7046d8b4b4513fe1f443f0595e07ed1d9a8d91457bc6db697c6dc3d112b4\": container with ID starting with 1fce7046d8b4b4513fe1f443f0595e07ed1d9a8d91457bc6db697c6dc3d112b4 not found: ID does not exist"
Apr 23 16:46:16.873016 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.872983 2561 scope.go:117] "RemoveContainer" containerID="4db9f3c5e796a2bf75a404126261dee327edd4e53f74421a4f3a60c35b478fda"
Apr 23 16:46:16.873199 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:46:16.873182 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4db9f3c5e796a2bf75a404126261dee327edd4e53f74421a4f3a60c35b478fda\": container with ID starting with 4db9f3c5e796a2bf75a404126261dee327edd4e53f74421a4f3a60c35b478fda not found: ID does not exist" containerID="4db9f3c5e796a2bf75a404126261dee327edd4e53f74421a4f3a60c35b478fda"
Apr 23 16:46:16.873239 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.873204 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4db9f3c5e796a2bf75a404126261dee327edd4e53f74421a4f3a60c35b478fda"} err="failed to get container status \"4db9f3c5e796a2bf75a404126261dee327edd4e53f74421a4f3a60c35b478fda\": rpc error: code = NotFound desc = could not find container \"4db9f3c5e796a2bf75a404126261dee327edd4e53f74421a4f3a60c35b478fda\": container with ID starting with 4db9f3c5e796a2bf75a404126261dee327edd4e53f74421a4f3a60c35b478fda not found: ID does not exist"
Apr 23 16:46:16.875699 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.875682 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-76596dd59f-mfr6m"]
Apr 23 16:46:16.887282 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.887261 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx"]
Apr 23 16:46:16.891449 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:16.891430 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-597b747667-ct7xx"]
Apr 23 16:46:17.812367 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:17.812335 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" podUID="4c581db6-1683-4dc5-a6dd-b0f330838a9d" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:46:18.031505 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:18.031475 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8d75948-b0cc-4c70-92db-d2d2455d34c9" path="/var/lib/kubelet/pods/a8d75948-b0cc-4c70-92db-d2d2455d34c9/volumes"
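"Cleaned up orphaned pod volumes dir" is the kubelet's housekeeping pass removing /var/lib/kubelet/pods/<podUID>/volumes once everything beneath it has been unmounted and detached. A loose illustration of the shape of that check, under the assumption that an empty volumes tree is safe to delete (this is a simplification, not a copy of kubelet_volumes.go, which also consults the volume manager's state):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cleanupOrphanedVolumesDir removes a terminated pod's volumes directory,
// but only if nothing is mounted there anymore (i.e. the tree is empty).
func cleanupOrphanedVolumesDir(kubeletRoot, podUID string) error {
	dir := filepath.Join(kubeletRoot, "pods", podUID, "volumes")
	entries, err := os.ReadDir(dir)
	if err != nil {
		return err
	}
	for _, e := range entries {
		sub, err := os.ReadDir(filepath.Join(dir, e.Name()))
		if err != nil {
			return err
		}
		if len(sub) > 0 {
			return fmt.Errorf("volumes still present under %s; skipping", dir)
		}
	}
	return os.RemoveAll(dir)
}

func main() {
	err := cleanupOrphanedVolumesDir("/var/lib/kubelet", "a8d75948-b0cc-4c70-92db-d2d2455d34c9")
	if err != nil {
		fmt.Println("cleanup skipped:", err)
	}
}
```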
Apr 23 16:46:18.031807 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:18.031795 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b78cea54-3e02-4b9e-9889-6887c5804a7f" path="/var/lib/kubelet/pods/b78cea54-3e02-4b9e-9889-6887c5804a7f/volumes"
Apr 23 16:46:22.722955 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:22.722930 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7"
Apr 23 16:46:22.812366 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:22.812334 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" podUID="4c581db6-1683-4dc5-a6dd-b0f330838a9d" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:46:22.812533 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:22.812444 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj"
Apr 23 16:46:22.835367 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:22.835339 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4" podUID="b847e371-ddda-4134-9e7c-8cd9ac33a9f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 23 16:46:27.812558 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:27.812511 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" podUID="4c581db6-1683-4dc5-a6dd-b0f330838a9d" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:46:32.812418 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:32.812378 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" podUID="4c581db6-1683-4dc5-a6dd-b0f330838a9d" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:46:32.835466 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:32.835435 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4" podUID="b847e371-ddda-4134-9e7c-8cd9ac33a9f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 23 16:46:37.813067 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:37.813032 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" podUID="4c581db6-1683-4dc5-a6dd-b0f330838a9d" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:46:41.375580 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.375561 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj"
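Unlike the connection-refused failures, the model-chainer 503s mean the server is up and answering but deliberately reporting unready — here while its pod is being torn down. A minimal sketch of a readiness endpoint with that behavior; the handler path and the ready flag are assumptions for illustration, not KServe's actual implementation:

```go
package main

import (
	"log"
	"net/http"
	"sync/atomic"
)

func main() {
	var ready atomic.Bool // set after startup work; cleared before shutdown

	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		if !ready.Load() {
			// The kubelet logs this as: HTTP probe failed with statuscode: 503
			http.Error(w, "not ready", http.StatusServiceUnavailable)
			return
		}
		w.WriteHeader(http.StatusOK)
	})

	ready.Store(true) // mark ready once the model graph is loaded
	// On SIGTERM a server like this would flip ready back to false, so the
	// readiness probe fails while in-flight requests drain.
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```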
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" Apr 23 16:46:41.488968 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.486642 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2"] Apr 23 16:46:41.488968 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.487277 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b78cea54-3e02-4b9e-9889-6887c5804a7f" containerName="storage-initializer" Apr 23 16:46:41.488968 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.487299 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="b78cea54-3e02-4b9e-9889-6887c5804a7f" containerName="storage-initializer" Apr 23 16:46:41.490791 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.489267 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8d75948-b0cc-4c70-92db-d2d2455d34c9" containerName="kserve-container" Apr 23 16:46:41.490791 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.489291 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8d75948-b0cc-4c70-92db-d2d2455d34c9" containerName="kserve-container" Apr 23 16:46:41.490791 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.489312 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c581db6-1683-4dc5-a6dd-b0f330838a9d" containerName="model-chainer" Apr 23 16:46:41.490791 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.489322 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c581db6-1683-4dc5-a6dd-b0f330838a9d" containerName="model-chainer" Apr 23 16:46:41.490791 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.489343 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b78cea54-3e02-4b9e-9889-6887c5804a7f" containerName="kserve-container" Apr 23 16:46:41.490791 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.489353 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="b78cea54-3e02-4b9e-9889-6887c5804a7f" containerName="kserve-container" Apr 23 16:46:41.490791 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.489364 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8d75948-b0cc-4c70-92db-d2d2455d34c9" containerName="storage-initializer" Apr 23 16:46:41.490791 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.489373 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8d75948-b0cc-4c70-92db-d2d2455d34c9" containerName="storage-initializer" Apr 23 16:46:41.490791 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.489516 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="b78cea54-3e02-4b9e-9889-6887c5804a7f" containerName="kserve-container" Apr 23 16:46:41.490791 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.489535 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8d75948-b0cc-4c70-92db-d2d2455d34c9" containerName="kserve-container" Apr 23 16:46:41.490791 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.489546 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c581db6-1683-4dc5-a6dd-b0f330838a9d" containerName="model-chainer" Apr 23 16:46:41.492829 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.492742 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c581db6-1683-4dc5-a6dd-b0f330838a9d-proxy-tls\") pod \"4c581db6-1683-4dc5-a6dd-b0f330838a9d\" (UID: 
\"4c581db6-1683-4dc5-a6dd-b0f330838a9d\") " Apr 23 16:46:41.492829 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.492778 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c581db6-1683-4dc5-a6dd-b0f330838a9d-openshift-service-ca-bundle\") pod \"4c581db6-1683-4dc5-a6dd-b0f330838a9d\" (UID: \"4c581db6-1683-4dc5-a6dd-b0f330838a9d\") " Apr 23 16:46:41.493156 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.493128 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" Apr 23 16:46:41.493246 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.493151 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c581db6-1683-4dc5-a6dd-b0f330838a9d-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "4c581db6-1683-4dc5-a6dd-b0f330838a9d" (UID: "4c581db6-1683-4dc5-a6dd-b0f330838a9d"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:46:41.494848 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.494821 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c581db6-1683-4dc5-a6dd-b0f330838a9d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4c581db6-1683-4dc5-a6dd-b0f330838a9d" (UID: "4c581db6-1683-4dc5-a6dd-b0f330838a9d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:46:41.496354 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.496336 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-be540-serving-cert\"" Apr 23 16:46:41.496618 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.496602 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-be540-kube-rbac-proxy-sar-config\"" Apr 23 16:46:41.503282 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.503258 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2"] Apr 23 16:46:41.593536 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.593500 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b5e2a72-ff54-4967-8fee-148b24ebf2be-proxy-tls\") pod \"switch-graph-be540-765869d5bd-xhdh2\" (UID: \"9b5e2a72-ff54-4967-8fee-148b24ebf2be\") " pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" Apr 23 16:46:41.593688 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.593557 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b5e2a72-ff54-4967-8fee-148b24ebf2be-openshift-service-ca-bundle\") pod \"switch-graph-be540-765869d5bd-xhdh2\" (UID: \"9b5e2a72-ff54-4967-8fee-148b24ebf2be\") " pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" Apr 23 16:46:41.593688 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.593633 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c581db6-1683-4dc5-a6dd-b0f330838a9d-proxy-tls\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 16:46:41.593688 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.593651 
Apr 23 16:46:41.694459 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.694426 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b5e2a72-ff54-4967-8fee-148b24ebf2be-openshift-service-ca-bundle\") pod \"switch-graph-be540-765869d5bd-xhdh2\" (UID: \"9b5e2a72-ff54-4967-8fee-148b24ebf2be\") " pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2"
Apr 23 16:46:41.694601 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.694483 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b5e2a72-ff54-4967-8fee-148b24ebf2be-proxy-tls\") pod \"switch-graph-be540-765869d5bd-xhdh2\" (UID: \"9b5e2a72-ff54-4967-8fee-148b24ebf2be\") " pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2"
Apr 23 16:46:41.694601 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:46:41.694579 2561 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-be540-serving-cert: secret "switch-graph-be540-serving-cert" not found
Apr 23 16:46:41.694673 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:46:41.694647 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b5e2a72-ff54-4967-8fee-148b24ebf2be-proxy-tls podName:9b5e2a72-ff54-4967-8fee-148b24ebf2be nodeName:}" failed. No retries permitted until 2026-04-23 16:46:42.19462788 +0000 UTC m=+696.748062027 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9b5e2a72-ff54-4967-8fee-148b24ebf2be-proxy-tls") pod "switch-graph-be540-765869d5bd-xhdh2" (UID: "9b5e2a72-ff54-4967-8fee-148b24ebf2be") : secret "switch-graph-be540-serving-cert" not found
Apr 23 16:46:41.695053 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.695034 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b5e2a72-ff54-4967-8fee-148b24ebf2be-openshift-service-ca-bundle\") pod \"switch-graph-be540-765869d5bd-xhdh2\" (UID: \"9b5e2a72-ff54-4967-8fee-148b24ebf2be\") " pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2"
Apr 23 16:46:41.915415 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.915387 2561 generic.go:358] "Generic (PLEG): container finished" podID="4c581db6-1683-4dc5-a6dd-b0f330838a9d" containerID="ed7254e2d8e40fa9c405f9dd191a98b8d47c345d904d669fc27a069838eb9524" exitCode=137
Apr 23 16:46:41.915549 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.915465 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj"
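exitCode=137 closes the loop on the "Killing container with a grace period ... gracePeriod=30" entry at 16:46:11: model-chainer did not exit within 30 s of SIGTERM, so the runtime escalated to SIGKILL, and 137 = 128 + 9 is the shell/runtime convention for death by signal 9. A tiny decoder for that convention:

```go
package main

import (
	"fmt"
	"syscall"
)

// signalFromExitCode decodes the 128+N convention that shells and container
// runtimes use to report a process killed by signal N.
func signalFromExitCode(code int) (syscall.Signal, bool) {
	if code > 128 && code < 128+64 {
		return syscall.Signal(code - 128), true
	}
	return 0, false
}

func main() {
	if sig, ok := signalFromExitCode(137); ok {
		fmt.Printf("exit 137 => signal %d (%s)\n", sig, sig) // signal 9 (killed)
	}
	if _, ok := signalFromExitCode(0); !ok {
		fmt.Println("exit 0 => clean exit, as in the ContainerDied entries above")
	}
}
```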
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" Apr 23 16:46:41.915549 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.915475 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" event={"ID":"4c581db6-1683-4dc5-a6dd-b0f330838a9d","Type":"ContainerDied","Data":"ed7254e2d8e40fa9c405f9dd191a98b8d47c345d904d669fc27a069838eb9524"} Apr 23 16:46:41.915549 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.915516 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-5768cff546-4jftj" event={"ID":"4c581db6-1683-4dc5-a6dd-b0f330838a9d","Type":"ContainerDied","Data":"77b686d19f5921089b44c5c4a1fd91573c995ae96949de1bb315df092601582a"} Apr 23 16:46:41.915549 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.915533 2561 scope.go:117] "RemoveContainer" containerID="ed7254e2d8e40fa9c405f9dd191a98b8d47c345d904d669fc27a069838eb9524" Apr 23 16:46:41.922923 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.922903 2561 scope.go:117] "RemoveContainer" containerID="ed7254e2d8e40fa9c405f9dd191a98b8d47c345d904d669fc27a069838eb9524" Apr 23 16:46:41.923206 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:46:41.923187 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed7254e2d8e40fa9c405f9dd191a98b8d47c345d904d669fc27a069838eb9524\": container with ID starting with ed7254e2d8e40fa9c405f9dd191a98b8d47c345d904d669fc27a069838eb9524 not found: ID does not exist" containerID="ed7254e2d8e40fa9c405f9dd191a98b8d47c345d904d669fc27a069838eb9524" Apr 23 16:46:41.923255 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.923214 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed7254e2d8e40fa9c405f9dd191a98b8d47c345d904d669fc27a069838eb9524"} err="failed to get container status \"ed7254e2d8e40fa9c405f9dd191a98b8d47c345d904d669fc27a069838eb9524\": rpc error: code = NotFound desc = could not find container \"ed7254e2d8e40fa9c405f9dd191a98b8d47c345d904d669fc27a069838eb9524\": container with ID starting with ed7254e2d8e40fa9c405f9dd191a98b8d47c345d904d669fc27a069838eb9524 not found: ID does not exist" Apr 23 16:46:41.935199 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.935176 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5768cff546-4jftj"] Apr 23 16:46:41.939065 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:41.939047 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5768cff546-4jftj"] Apr 23 16:46:42.030850 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:42.030826 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c581db6-1683-4dc5-a6dd-b0f330838a9d" path="/var/lib/kubelet/pods/4c581db6-1683-4dc5-a6dd-b0f330838a9d/volumes" Apr 23 16:46:42.199335 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:42.199258 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b5e2a72-ff54-4967-8fee-148b24ebf2be-proxy-tls\") pod \"switch-graph-be540-765869d5bd-xhdh2\" (UID: \"9b5e2a72-ff54-4967-8fee-148b24ebf2be\") " pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" Apr 23 16:46:42.201510 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:42.201494 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/9b5e2a72-ff54-4967-8fee-148b24ebf2be-proxy-tls\") pod \"switch-graph-be540-765869d5bd-xhdh2\" (UID: \"9b5e2a72-ff54-4967-8fee-148b24ebf2be\") " pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" Apr 23 16:46:42.408097 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:42.408074 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" Apr 23 16:46:42.520712 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:42.520689 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2"] Apr 23 16:46:42.522949 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:46:42.522924 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b5e2a72_ff54_4967_8fee_148b24ebf2be.slice/crio-43dbe60d55dc2e99e938274608560bf5ec1a9e2ac95286dfa147877b380c8a31 WatchSource:0}: Error finding container 43dbe60d55dc2e99e938274608560bf5ec1a9e2ac95286dfa147877b380c8a31: Status 404 returned error can't find the container with id 43dbe60d55dc2e99e938274608560bf5ec1a9e2ac95286dfa147877b380c8a31 Apr 23 16:46:42.834658 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:42.834621 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4" podUID="b847e371-ddda-4134-9e7c-8cd9ac33a9f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 16:46:42.920028 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:42.919999 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" event={"ID":"9b5e2a72-ff54-4967-8fee-148b24ebf2be","Type":"ContainerStarted","Data":"82dd2de38779093611184110a2121c9cd23d53e20940b1ca79ea8d6f4756384c"} Apr 23 16:46:42.920028 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:42.920028 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" event={"ID":"9b5e2a72-ff54-4967-8fee-148b24ebf2be","Type":"ContainerStarted","Data":"43dbe60d55dc2e99e938274608560bf5ec1a9e2ac95286dfa147877b380c8a31"} Apr 23 16:46:42.920314 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:42.920141 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" Apr 23 16:46:48.929340 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:48.929314 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" Apr 23 16:46:48.946206 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:48.946162 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" podStartSLOduration=7.946132219 podStartE2EDuration="7.946132219s" podCreationTimestamp="2026-04-23 16:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:46:42.947067349 +0000 UTC m=+697.500501517" watchObservedRunningTime="2026-04-23 16:46:48.946132219 +0000 UTC m=+703.499566388" Apr 23 16:46:52.835122 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:46:52.835078 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4" 
podUID="b847e371-ddda-4134-9e7c-8cd9ac33a9f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 16:47:02.836512 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:02.836479 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4" Apr 23 16:47:21.378387 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:21.378356 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245"] Apr 23 16:47:21.382919 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:21.382902 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245" Apr 23 16:47:21.385509 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:21.385489 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-39400-serving-cert\"" Apr 23 16:47:21.385594 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:21.385516 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-39400-kube-rbac-proxy-sar-config\"" Apr 23 16:47:21.388524 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:21.388501 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245"] Apr 23 16:47:21.566353 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:21.566327 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5029734-0ac9-4a1e-8c2a-c27358af7e09-openshift-service-ca-bundle\") pod \"sequence-graph-39400-6c4fd97fc6-7q245\" (UID: \"f5029734-0ac9-4a1e-8c2a-c27358af7e09\") " pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245" Apr 23 16:47:21.566498 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:21.566359 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5029734-0ac9-4a1e-8c2a-c27358af7e09-proxy-tls\") pod \"sequence-graph-39400-6c4fd97fc6-7q245\" (UID: \"f5029734-0ac9-4a1e-8c2a-c27358af7e09\") " pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245" Apr 23 16:47:21.667755 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:21.667677 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5029734-0ac9-4a1e-8c2a-c27358af7e09-openshift-service-ca-bundle\") pod \"sequence-graph-39400-6c4fd97fc6-7q245\" (UID: \"f5029734-0ac9-4a1e-8c2a-c27358af7e09\") " pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245" Apr 23 16:47:21.667755 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:21.667713 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5029734-0ac9-4a1e-8c2a-c27358af7e09-proxy-tls\") pod \"sequence-graph-39400-6c4fd97fc6-7q245\" (UID: \"f5029734-0ac9-4a1e-8c2a-c27358af7e09\") " pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245" Apr 23 16:47:21.667952 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:47:21.667934 2561 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-39400-serving-cert: secret "sequence-graph-39400-serving-cert" not found Apr 23 16:47:21.668063 ip-10-0-141-189 
Apr 23 16:47:21.668557 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:21.668539 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5029734-0ac9-4a1e-8c2a-c27358af7e09-openshift-service-ca-bundle\") pod \"sequence-graph-39400-6c4fd97fc6-7q245\" (UID: \"f5029734-0ac9-4a1e-8c2a-c27358af7e09\") " pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245"
Apr 23 16:47:22.171488 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:22.171441 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5029734-0ac9-4a1e-8c2a-c27358af7e09-proxy-tls\") pod \"sequence-graph-39400-6c4fd97fc6-7q245\" (UID: \"f5029734-0ac9-4a1e-8c2a-c27358af7e09\") " pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245"
Apr 23 16:47:22.173921 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:22.173901 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5029734-0ac9-4a1e-8c2a-c27358af7e09-proxy-tls\") pod \"sequence-graph-39400-6c4fd97fc6-7q245\" (UID: \"f5029734-0ac9-4a1e-8c2a-c27358af7e09\") " pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245"
Apr 23 16:47:22.293703 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:22.293678 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245"
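The proxy-tls mount fails once because the serving-cert secret hasn't been minted yet; nestedpendingoperations then blocks retries for 500 ms (the durationBeforeRetry in the log), and the retry at 16:47:22.171 succeeds. The kubelet's own backoff bookkeeping lives in nestedpendingoperations.go, but the same wait-and-retry shape can be sketched with apimachinery's wait helpers — the values below just echo the log, they are not the kubelet's configuration:

```go
package main

import (
	"errors"
	"fmt"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

var errSecretNotFound = errors.New(`secret "sequence-graph-39400-serving-cert" not found`)

func main() {
	attempt := 0
	// Illustrative analogue of the volume manager's retry: start at 500ms
	// and back off exponentially until the secret shows up.
	err := wait.ExponentialBackoff(wait.Backoff{
		Duration: 500 * time.Millisecond,
		Factor:   2.0,
		Steps:    5,
	}, func() (bool, error) {
		attempt++
		if attempt < 2 {
			fmt.Println("MountVolume.SetUp failed:", errSecretNotFound)
			return false, nil // not done; retry after the backoff delay
		}
		fmt.Println("MountVolume.SetUp succeeded on attempt", attempt)
		return true, nil
	})
	if err != nil {
		fmt.Println("gave up:", err)
	}
}
```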
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245" Apr 23 16:47:22.410183 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:22.410150 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245"] Apr 23 16:47:22.412927 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:47:22.412898 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5029734_0ac9_4a1e_8c2a_c27358af7e09.slice/crio-9f7b6227f5e5392327ad1f5a237a7d188f8dea5f815cbaecf6c1ccfade87d1f0 WatchSource:0}: Error finding container 9f7b6227f5e5392327ad1f5a237a7d188f8dea5f815cbaecf6c1ccfade87d1f0: Status 404 returned error can't find the container with id 9f7b6227f5e5392327ad1f5a237a7d188f8dea5f815cbaecf6c1ccfade87d1f0 Apr 23 16:47:22.414700 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:22.414681 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:47:23.031948 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:23.031913 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245" event={"ID":"f5029734-0ac9-4a1e-8c2a-c27358af7e09","Type":"ContainerStarted","Data":"99aa2c47b2c774d5156f9bf1cb43424e09bf06e02dde18315992505cfbebf1ae"} Apr 23 16:47:23.031948 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:23.031947 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245" event={"ID":"f5029734-0ac9-4a1e-8c2a-c27358af7e09","Type":"ContainerStarted","Data":"9f7b6227f5e5392327ad1f5a237a7d188f8dea5f815cbaecf6c1ccfade87d1f0"} Apr 23 16:47:23.032163 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:23.032041 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245" Apr 23 16:47:23.051402 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:23.051358 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245" podStartSLOduration=2.051342712 podStartE2EDuration="2.051342712s" podCreationTimestamp="2026-04-23 16:47:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:47:23.049991312 +0000 UTC m=+737.603425479" watchObservedRunningTime="2026-04-23 16:47:23.051342712 +0000 UTC m=+737.604776880" Apr 23 16:47:29.039484 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:47:29.039455 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245" Apr 23 16:54:56.236083 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:56.235853 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2"] Apr 23 16:54:56.238623 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:56.236220 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" podUID="9b5e2a72-ff54-4967-8fee-148b24ebf2be" containerName="switch-graph-be540" containerID="cri-o://82dd2de38779093611184110a2121c9cd23d53e20940b1ca79ea8d6f4756384c" gracePeriod=30 Apr 23 16:54:56.334698 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:56.334666 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7"] Apr 23 16:54:56.334968 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:56.334942 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7" podUID="f0728115-e6b3-4748-9502-0f79c2fba7b0" containerName="kserve-container" containerID="cri-o://218861abf0cdf055fd5f51f8b3a9a2045fa56bf7702033fc54e2bf74101f7f62" gracePeriod=30 Apr 23 16:54:56.398653 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:56.398618 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6"] Apr 23 16:54:56.401560 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:56.401545 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6" Apr 23 16:54:56.411037 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:56.411015 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6" Apr 23 16:54:56.419365 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:56.419337 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6"] Apr 23 16:54:56.533149 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:56.533086 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6"] Apr 23 16:54:56.538235 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:54:56.538208 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7fe4523_9d21_4b07_9037_141b73c6c9ea.slice/crio-811cacd25852f35c82973ae2f1400e22d1cb08bb99c42cac7c5dc7a8addb812b WatchSource:0}: Error finding container 811cacd25852f35c82973ae2f1400e22d1cb08bb99c42cac7c5dc7a8addb812b: Status 404 returned error can't find the container with id 811cacd25852f35c82973ae2f1400e22d1cb08bb99c42cac7c5dc7a8addb812b Apr 23 16:54:56.540009 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:56.539992 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:54:57.201120 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:57.201087 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6" event={"ID":"a7fe4523-9d21-4b07-9037-141b73c6c9ea","Type":"ContainerStarted","Data":"c4d73fa52856389287c58f685d65493640a4c1b2344cd7777931637f8ff23f14"} Apr 23 16:54:57.201120 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:57.201123 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6" event={"ID":"a7fe4523-9d21-4b07-9037-141b73c6c9ea","Type":"ContainerStarted","Data":"811cacd25852f35c82973ae2f1400e22d1cb08bb99c42cac7c5dc7a8addb812b"} Apr 23 16:54:57.201341 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:57.201263 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6" Apr 23 16:54:57.202542 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:57.202518 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6" podUID="a7fe4523-9d21-4b07-9037-141b73c6c9ea" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 23 16:54:57.216507 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:57.216466 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6" podStartSLOduration=1.216451758 podStartE2EDuration="1.216451758s" podCreationTimestamp="2026-04-23 16:54:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:54:57.215512764 +0000 UTC m=+1191.768946934" watchObservedRunningTime="2026-04-23 16:54:57.216451758 +0000 UTC m=+1191.769885928" Apr 23 16:54:58.204741 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:58.204709 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6" podUID="a7fe4523-9d21-4b07-9037-141b73c6c9ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 23 16:54:58.926849 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:58.926813 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" podUID="9b5e2a72-ff54-4967-8fee-148b24ebf2be" containerName="switch-graph-be540" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:54:59.173360 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:59.173337 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7" Apr 23 16:54:59.208531 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:59.208448 2561 generic.go:358] "Generic (PLEG): container finished" podID="f0728115-e6b3-4748-9502-0f79c2fba7b0" containerID="218861abf0cdf055fd5f51f8b3a9a2045fa56bf7702033fc54e2bf74101f7f62" exitCode=0 Apr 23 16:54:59.208531 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:59.208507 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7" Apr 23 16:54:59.209028 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:59.208517 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7" event={"ID":"f0728115-e6b3-4748-9502-0f79c2fba7b0","Type":"ContainerDied","Data":"218861abf0cdf055fd5f51f8b3a9a2045fa56bf7702033fc54e2bf74101f7f62"} Apr 23 16:54:59.209028 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:59.208570 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7" event={"ID":"f0728115-e6b3-4748-9502-0f79c2fba7b0","Type":"ContainerDied","Data":"bda1d9cec0f385c71834403bded9f34ba3ac15ff710b3e9838dd9b1998858fb1"} Apr 23 16:54:59.209028 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:59.208593 2561 scope.go:117] "RemoveContainer" containerID="218861abf0cdf055fd5f51f8b3a9a2045fa56bf7702033fc54e2bf74101f7f62" Apr 23 16:54:59.216283 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:59.216268 2561 scope.go:117] "RemoveContainer" containerID="218861abf0cdf055fd5f51f8b3a9a2045fa56bf7702033fc54e2bf74101f7f62" Apr 23 16:54:59.216504 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:54:59.216484 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"218861abf0cdf055fd5f51f8b3a9a2045fa56bf7702033fc54e2bf74101f7f62\": container with ID starting with 218861abf0cdf055fd5f51f8b3a9a2045fa56bf7702033fc54e2bf74101f7f62 not found: ID does not exist" containerID="218861abf0cdf055fd5f51f8b3a9a2045fa56bf7702033fc54e2bf74101f7f62" Apr 23 16:54:59.216571 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:59.216517 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218861abf0cdf055fd5f51f8b3a9a2045fa56bf7702033fc54e2bf74101f7f62"} err="failed to get container status \"218861abf0cdf055fd5f51f8b3a9a2045fa56bf7702033fc54e2bf74101f7f62\": rpc error: code = NotFound desc = could not find container \"218861abf0cdf055fd5f51f8b3a9a2045fa56bf7702033fc54e2bf74101f7f62\": container with ID starting with 218861abf0cdf055fd5f51f8b3a9a2045fa56bf7702033fc54e2bf74101f7f62 not found: ID does not exist" Apr 23 16:54:59.230763 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:59.230743 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7"] Apr 23 16:54:59.233862 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:54:59.233838 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-be540-predictor-5b5665dc9-8cfm7"] Apr 23 16:55:00.031406 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:00.031372 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0728115-e6b3-4748-9502-0f79c2fba7b0" path="/var/lib/kubelet/pods/f0728115-e6b3-4748-9502-0f79c2fba7b0/volumes" Apr 23 16:55:03.927168 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:03.927129 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" podUID="9b5e2a72-ff54-4967-8fee-148b24ebf2be" containerName="switch-graph-be540" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:55:08.205630 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:08.205578 2561 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6" podUID="a7fe4523-9d21-4b07-9037-141b73c6c9ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 23 16:55:08.927709 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:08.927674 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" podUID="9b5e2a72-ff54-4967-8fee-148b24ebf2be" containerName="switch-graph-be540" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:55:08.927904 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:08.927773 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" Apr 23 16:55:13.927504 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:13.927470 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" podUID="9b5e2a72-ff54-4967-8fee-148b24ebf2be" containerName="switch-graph-be540" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:55:18.205219 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:18.205179 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6" podUID="a7fe4523-9d21-4b07-9037-141b73c6c9ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 23 16:55:18.927418 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:18.927381 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" podUID="9b5e2a72-ff54-4967-8fee-148b24ebf2be" containerName="switch-graph-be540" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:55:23.927397 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:23.927360 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" podUID="9b5e2a72-ff54-4967-8fee-148b24ebf2be" containerName="switch-graph-be540" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:55:26.280314 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:26.280288 2561 generic.go:358] "Generic (PLEG): container finished" podID="9b5e2a72-ff54-4967-8fee-148b24ebf2be" containerID="82dd2de38779093611184110a2121c9cd23d53e20940b1ca79ea8d6f4756384c" exitCode=0 Apr 23 16:55:26.280587 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:26.280342 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" event={"ID":"9b5e2a72-ff54-4967-8fee-148b24ebf2be","Type":"ContainerDied","Data":"82dd2de38779093611184110a2121c9cd23d53e20940b1ca79ea8d6f4756384c"} Apr 23 16:55:26.870028 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:26.870008 2561 util.go:48] "No ready sandbox for pod can be found. 
Apr 23 16:55:26.928277 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:26.928250 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b5e2a72-ff54-4967-8fee-148b24ebf2be-proxy-tls\") pod \"9b5e2a72-ff54-4967-8fee-148b24ebf2be\" (UID: \"9b5e2a72-ff54-4967-8fee-148b24ebf2be\") "
Apr 23 16:55:26.928394 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:26.928353 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b5e2a72-ff54-4967-8fee-148b24ebf2be-openshift-service-ca-bundle\") pod \"9b5e2a72-ff54-4967-8fee-148b24ebf2be\" (UID: \"9b5e2a72-ff54-4967-8fee-148b24ebf2be\") "
Apr 23 16:55:26.928661 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:26.928636 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b5e2a72-ff54-4967-8fee-148b24ebf2be-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "9b5e2a72-ff54-4967-8fee-148b24ebf2be" (UID: "9b5e2a72-ff54-4967-8fee-148b24ebf2be"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:55:26.930094 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:26.930073 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b5e2a72-ff54-4967-8fee-148b24ebf2be-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9b5e2a72-ff54-4967-8fee-148b24ebf2be" (UID: "9b5e2a72-ff54-4967-8fee-148b24ebf2be"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:55:27.029325 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:27.029303 2561 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b5e2a72-ff54-4967-8fee-148b24ebf2be-openshift-service-ca-bundle\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 23 16:55:27.029325 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:27.029325 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b5e2a72-ff54-4967-8fee-148b24ebf2be-proxy-tls\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 23 16:55:27.283996 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:27.283928 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2" event={"ID":"9b5e2a72-ff54-4967-8fee-148b24ebf2be","Type":"ContainerDied","Data":"43dbe60d55dc2e99e938274608560bf5ec1a9e2ac95286dfa147877b380c8a31"}
Apr 23 16:55:27.283996 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:27.283957 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2"
Apr 23 16:55:27.283996 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:27.283977 2561 scope.go:117] "RemoveContainer" containerID="82dd2de38779093611184110a2121c9cd23d53e20940b1ca79ea8d6f4756384c"
Apr 23 16:55:27.303794 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:27.303770 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2"]
Apr 23 16:55:27.309622 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:27.309599 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-be540-765869d5bd-xhdh2"]
Apr 23 16:55:28.031450 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:28.031416 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b5e2a72-ff54-4967-8fee-148b24ebf2be" path="/var/lib/kubelet/pods/9b5e2a72-ff54-4967-8fee-148b24ebf2be/volumes"
Apr 23 16:55:28.205016 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:28.204976 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6" podUID="a7fe4523-9d21-4b07-9037-141b73c6c9ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 23 16:55:36.106220 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:36.106191 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245"]
Apr 23 16:55:36.106654 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:36.106473 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245" podUID="f5029734-0ac9-4a1e-8c2a-c27358af7e09" containerName="sequence-graph-39400" containerID="cri-o://99aa2c47b2c774d5156f9bf1cb43424e09bf06e02dde18315992505cfbebf1ae" gracePeriod=30
Apr 23 16:55:36.219045 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:36.219014 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4"]
Apr 23 16:55:36.219328 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:36.219305 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4" podUID="b847e371-ddda-4134-9e7c-8cd9ac33a9f5" containerName="kserve-container" containerID="cri-o://3903b970763fc48cb6cb84fe97ba0eea179ca8d1f0322c4dd04f80d2e8bc725d" gracePeriod=30
Apr 23 16:55:36.242070 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:36.242045 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg"]
Apr 23 16:55:36.242444 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:36.242317 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b5e2a72-ff54-4967-8fee-148b24ebf2be" containerName="switch-graph-be540"
Apr 23 16:55:36.242444 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:36.242335 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b5e2a72-ff54-4967-8fee-148b24ebf2be" containerName="switch-graph-be540"
Apr 23 16:55:36.242444 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:36.242350 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0728115-e6b3-4748-9502-0f79c2fba7b0" containerName="kserve-container"
Apr 23 16:55:36.242444 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:36.242359 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0728115-e6b3-4748-9502-0f79c2fba7b0" containerName="kserve-container"
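
Annotation: the reconciler/operation_generator entries above trace the fixed volume teardown order for a deleted pod: UnmountVolume.TearDown per volume, then "Volume detached" once the mount is gone, and only then does kubelet_volumes.go remove the now-empty /var/lib/kubelet/pods/<UID>/volumes directory. A sketch of that ordering (function and names are illustrative, not the kubelet's real API):

package main

import "fmt"

// teardownPodVolumes mirrors the order visible in the log: unmount every
// volume, mark each detached, then clean up the orphaned volumes dir.
func teardownPodVolumes(podUID string, volumes []string) {
	for _, v := range volumes {
		fmt.Printf("UnmountVolume.TearDown succeeded for volume %q pod %q\n", v, podUID)
	}
	for _, v := range volumes {
		fmt.Printf("Volume detached for volume %q\n", v)
	}
	// Only after every volume is unmounted and detached may the pod dir go.
	fmt.Printf("Cleaned up orphaned pod volumes dir path=/var/lib/kubelet/pods/%s/volumes\n", podUID)
}

func main() {
	teardownPodVolumes("9b5e2a72-ff54-4967-8fee-148b24ebf2be",
		[]string{"proxy-tls", "openshift-service-ca-bundle"})
}
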
Apr 23 16:55:36.242444 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:36.242427 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="f0728115-e6b3-4748-9502-0f79c2fba7b0" containerName="kserve-container"
Apr 23 16:55:36.242444 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:36.242437 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b5e2a72-ff54-4967-8fee-148b24ebf2be" containerName="switch-graph-be540"
Apr 23 16:55:36.245106 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:36.245089 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg"
Apr 23 16:55:36.254783 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:36.254763 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg"
Apr 23 16:55:36.254911 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:36.254789 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg"]
Apr 23 16:55:36.387091 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:36.387016 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg"]
Apr 23 16:55:36.391274 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:55:36.391246 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99de4ace_3a2d_4fee_8d6f_5e6f79b6e79f.slice/crio-fe1d085dbe6efbab808e098959adfdb34104012fb1090b2ea83963b59dc29816 WatchSource:0}: Error finding container fe1d085dbe6efbab808e098959adfdb34104012fb1090b2ea83963b59dc29816: Status 404 returned error can't find the container with id fe1d085dbe6efbab808e098959adfdb34104012fb1090b2ea83963b59dc29816
Apr 23 16:55:37.312343 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:37.312302 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg" event={"ID":"99de4ace-3a2d-4fee-8d6f-5e6f79b6e79f","Type":"ContainerStarted","Data":"69db51b436832c1ccda641971cc772c5447d53ad43a2ea35a710cf825f67a716"}
Apr 23 16:55:37.312343 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:37.312342 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg" event={"ID":"99de4ace-3a2d-4fee-8d6f-5e6f79b6e79f","Type":"ContainerStarted","Data":"fe1d085dbe6efbab808e098959adfdb34104012fb1090b2ea83963b59dc29816"}
Apr 23 16:55:37.312763 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:37.312514 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg"
Apr 23 16:55:37.313748 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:37.313726 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg" podUID="99de4ace-3a2d-4fee-8d6f-5e6f79b6e79f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 23 16:55:37.328115 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:37.328075 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg" podStartSLOduration=1.328063023 podStartE2EDuration="1.328063023s" podCreationTimestamp="2026-04-23 16:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:55:37.32663347 +0000 UTC m=+1231.880067642" watchObservedRunningTime="2026-04-23 16:55:37.328063023 +0000 UTC m=+1231.881497191"
Apr 23 16:55:38.205748 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:38.205708 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6" podUID="a7fe4523-9d21-4b07-9037-141b73c6c9ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 23 16:55:38.315025 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:38.314988 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg" podUID="99de4ace-3a2d-4fee-8d6f-5e6f79b6e79f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 23 16:55:39.038157 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:39.038120 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245" podUID="f5029734-0ac9-4a1e-8c2a-c27358af7e09" containerName="sequence-graph-39400" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:55:39.655018 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:39.654997 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4"
Apr 23 16:55:40.321477 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:40.321396 2561 generic.go:358] "Generic (PLEG): container finished" podID="b847e371-ddda-4134-9e7c-8cd9ac33a9f5" containerID="3903b970763fc48cb6cb84fe97ba0eea179ca8d1f0322c4dd04f80d2e8bc725d" exitCode=0
Apr 23 16:55:40.321477 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:40.321453 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4" event={"ID":"b847e371-ddda-4134-9e7c-8cd9ac33a9f5","Type":"ContainerDied","Data":"3903b970763fc48cb6cb84fe97ba0eea179ca8d1f0322c4dd04f80d2e8bc725d"}
Apr 23 16:55:40.321477 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:40.321458 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4"
Apr 23 16:55:40.321477 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:40.321477 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4" event={"ID":"b847e371-ddda-4134-9e7c-8cd9ac33a9f5","Type":"ContainerDied","Data":"bd9e5cbaa7fcdf3bfd2470fb5880c4f318b4f77b455c629a0d6589146fb831bb"}
Apr 23 16:55:40.321751 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:40.321492 2561 scope.go:117] "RemoveContainer" containerID="3903b970763fc48cb6cb84fe97ba0eea179ca8d1f0322c4dd04f80d2e8bc725d"
Apr 23 16:55:40.328836 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:40.328819 2561 scope.go:117] "RemoveContainer" containerID="3903b970763fc48cb6cb84fe97ba0eea179ca8d1f0322c4dd04f80d2e8bc725d"
Apr 23 16:55:40.329105 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:55:40.329085 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3903b970763fc48cb6cb84fe97ba0eea179ca8d1f0322c4dd04f80d2e8bc725d\": container with ID starting with 3903b970763fc48cb6cb84fe97ba0eea179ca8d1f0322c4dd04f80d2e8bc725d not found: ID does not exist" containerID="3903b970763fc48cb6cb84fe97ba0eea179ca8d1f0322c4dd04f80d2e8bc725d"
Apr 23 16:55:40.329169 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:40.329117 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3903b970763fc48cb6cb84fe97ba0eea179ca8d1f0322c4dd04f80d2e8bc725d"} err="failed to get container status \"3903b970763fc48cb6cb84fe97ba0eea179ca8d1f0322c4dd04f80d2e8bc725d\": rpc error: code = NotFound desc = could not find container \"3903b970763fc48cb6cb84fe97ba0eea179ca8d1f0322c4dd04f80d2e8bc725d\": container with ID starting with 3903b970763fc48cb6cb84fe97ba0eea179ca8d1f0322c4dd04f80d2e8bc725d not found: ID does not exist"
Apr 23 16:55:40.336306 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:40.336280 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4"]
Apr 23 16:55:40.342223 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:40.342200 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-39400-predictor-864b86558c-nwrd4"]
Apr 23 16:55:42.030784 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:42.030752 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b847e371-ddda-4134-9e7c-8cd9ac33a9f5" path="/var/lib/kubelet/pods/b847e371-ddda-4134-9e7c-8cd9ac33a9f5/volumes"
Apr 23 16:55:44.038064 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:44.038026 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245" podUID="f5029734-0ac9-4a1e-8c2a-c27358af7e09" containerName="sequence-graph-39400" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:55:48.205679 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:48.205650 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6"
Apr 23 16:55:48.315255 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:48.315216 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg" podUID="99de4ace-3a2d-4fee-8d6f-5e6f79b6e79f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 23 16:55:49.038581 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:49.038546 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245" podUID="f5029734-0ac9-4a1e-8c2a-c27358af7e09" containerName="sequence-graph-39400" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:55:49.038740 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:49.038646 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245"
Apr 23 16:55:54.038142 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:54.038103 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245" podUID="f5029734-0ac9-4a1e-8c2a-c27358af7e09" containerName="sequence-graph-39400" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:55:58.315280 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:58.315199 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg" podUID="99de4ace-3a2d-4fee-8d6f-5e6f79b6e79f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 23 16:55:59.038531 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:55:59.038494 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245" podUID="f5029734-0ac9-4a1e-8c2a-c27358af7e09" containerName="sequence-graph-39400" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:56:04.039465 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:04.039432 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245" podUID="f5029734-0ac9-4a1e-8c2a-c27358af7e09" containerName="sequence-graph-39400" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:56:06.239902 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.239864 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245"
Apr 23 16:56:06.398337 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.398305 2561 generic.go:358] "Generic (PLEG): container finished" podID="f5029734-0ac9-4a1e-8c2a-c27358af7e09" containerID="99aa2c47b2c774d5156f9bf1cb43424e09bf06e02dde18315992505cfbebf1ae" exitCode=0
Apr 23 16:56:06.398482 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.398358 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245"
Apr 23 16:56:06.398482 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.398379 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245" event={"ID":"f5029734-0ac9-4a1e-8c2a-c27358af7e09","Type":"ContainerDied","Data":"99aa2c47b2c774d5156f9bf1cb43424e09bf06e02dde18315992505cfbebf1ae"}
Apr 23 16:56:06.398482 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.398419 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245" event={"ID":"f5029734-0ac9-4a1e-8c2a-c27358af7e09","Type":"ContainerDied","Data":"9f7b6227f5e5392327ad1f5a237a7d188f8dea5f815cbaecf6c1ccfade87d1f0"}
Apr 23 16:56:06.398482 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.398443 2561 scope.go:117] "RemoveContainer" containerID="99aa2c47b2c774d5156f9bf1cb43424e09bf06e02dde18315992505cfbebf1ae"
Apr 23 16:56:06.404277 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.404259 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5029734-0ac9-4a1e-8c2a-c27358af7e09-openshift-service-ca-bundle\") pod \"f5029734-0ac9-4a1e-8c2a-c27358af7e09\" (UID: \"f5029734-0ac9-4a1e-8c2a-c27358af7e09\") "
Apr 23 16:56:06.404376 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.404340 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5029734-0ac9-4a1e-8c2a-c27358af7e09-proxy-tls\") pod \"f5029734-0ac9-4a1e-8c2a-c27358af7e09\" (UID: \"f5029734-0ac9-4a1e-8c2a-c27358af7e09\") "
Apr 23 16:56:06.404574 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.404552 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5029734-0ac9-4a1e-8c2a-c27358af7e09-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "f5029734-0ac9-4a1e-8c2a-c27358af7e09" (UID: "f5029734-0ac9-4a1e-8c2a-c27358af7e09"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:56:06.405850 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.405833 2561 scope.go:117] "RemoveContainer" containerID="99aa2c47b2c774d5156f9bf1cb43424e09bf06e02dde18315992505cfbebf1ae"
Apr 23 16:56:06.406130 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:56:06.406102 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99aa2c47b2c774d5156f9bf1cb43424e09bf06e02dde18315992505cfbebf1ae\": container with ID starting with 99aa2c47b2c774d5156f9bf1cb43424e09bf06e02dde18315992505cfbebf1ae not found: ID does not exist" containerID="99aa2c47b2c774d5156f9bf1cb43424e09bf06e02dde18315992505cfbebf1ae"
Apr 23 16:56:06.406199 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.406142 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99aa2c47b2c774d5156f9bf1cb43424e09bf06e02dde18315992505cfbebf1ae"} err="failed to get container status \"99aa2c47b2c774d5156f9bf1cb43424e09bf06e02dde18315992505cfbebf1ae\": rpc error: code = NotFound desc = could not find container \"99aa2c47b2c774d5156f9bf1cb43424e09bf06e02dde18315992505cfbebf1ae\": container with ID starting with 99aa2c47b2c774d5156f9bf1cb43424e09bf06e02dde18315992505cfbebf1ae not found: ID does not exist"
Apr 23 16:56:06.406402 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.406386 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5029734-0ac9-4a1e-8c2a-c27358af7e09-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f5029734-0ac9-4a1e-8c2a-c27358af7e09" (UID: "f5029734-0ac9-4a1e-8c2a-c27358af7e09"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:56:06.485096 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.485066 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55"]
Apr 23 16:56:06.485353 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.485330 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5029734-0ac9-4a1e-8c2a-c27358af7e09" containerName="sequence-graph-39400"
Apr 23 16:56:06.485353 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.485351 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5029734-0ac9-4a1e-8c2a-c27358af7e09" containerName="sequence-graph-39400"
Apr 23 16:56:06.485444 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.485371 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b847e371-ddda-4134-9e7c-8cd9ac33a9f5" containerName="kserve-container"
Apr 23 16:56:06.485444 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.485377 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="b847e371-ddda-4134-9e7c-8cd9ac33a9f5" containerName="kserve-container"
Apr 23 16:56:06.485444 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.485422 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="f5029734-0ac9-4a1e-8c2a-c27358af7e09" containerName="sequence-graph-39400"
Apr 23 16:56:06.485444 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.485432 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="b847e371-ddda-4134-9e7c-8cd9ac33a9f5" containerName="kserve-container"
Apr 23 16:56:06.489662 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.489638 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55"
Apr 23 16:56:06.492160 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.492142 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-c88c1-kube-rbac-proxy-sar-config\""
Apr 23 16:56:06.492277 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.492142 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-c88c1-serving-cert\""
Apr 23 16:56:06.495393 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.495371 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55"]
Apr 23 16:56:06.505327 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.505309 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5029734-0ac9-4a1e-8c2a-c27358af7e09-proxy-tls\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 23 16:56:06.505327 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.505328 2561 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5029734-0ac9-4a1e-8c2a-c27358af7e09-openshift-service-ca-bundle\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\""
Apr 23 16:56:06.605791 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.605762 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21a3ee2f-93ab-4870-901f-2efa216407ac-proxy-tls\") pod \"ensemble-graph-c88c1-79c8484c9b-hmj55\" (UID: \"21a3ee2f-93ab-4870-901f-2efa216407ac\") " pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55"
Apr 23 16:56:06.605927 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.605795 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21a3ee2f-93ab-4870-901f-2efa216407ac-openshift-service-ca-bundle\") pod \"ensemble-graph-c88c1-79c8484c9b-hmj55\" (UID: \"21a3ee2f-93ab-4870-901f-2efa216407ac\") " pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55"
Apr 23 16:56:06.706239 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.706213 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21a3ee2f-93ab-4870-901f-2efa216407ac-proxy-tls\") pod \"ensemble-graph-c88c1-79c8484c9b-hmj55\" (UID: \"21a3ee2f-93ab-4870-901f-2efa216407ac\") " pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55"
Apr 23 16:56:06.706340 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.706255 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21a3ee2f-93ab-4870-901f-2efa216407ac-openshift-service-ca-bundle\") pod \"ensemble-graph-c88c1-79c8484c9b-hmj55\" (UID: \"21a3ee2f-93ab-4870-901f-2efa216407ac\") " pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55"
Apr 23 16:56:06.706405 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:56:06.706364 2561 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-c88c1-serving-cert: secret "ensemble-graph-c88c1-serving-cert" not found
Apr 23 16:56:06.706458 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:56:06.706423 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21a3ee2f-93ab-4870-901f-2efa216407ac-proxy-tls podName:21a3ee2f-93ab-4870-901f-2efa216407ac nodeName:}" failed. No retries permitted until 2026-04-23 16:56:07.206404195 +0000 UTC m=+1261.759838348 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/21a3ee2f-93ab-4870-901f-2efa216407ac-proxy-tls") pod "ensemble-graph-c88c1-79c8484c9b-hmj55" (UID: "21a3ee2f-93ab-4870-901f-2efa216407ac") : secret "ensemble-graph-c88c1-serving-cert" not found
Apr 23 16:56:06.706844 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.706822 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21a3ee2f-93ab-4870-901f-2efa216407ac-openshift-service-ca-bundle\") pod \"ensemble-graph-c88c1-79c8484c9b-hmj55\" (UID: \"21a3ee2f-93ab-4870-901f-2efa216407ac\") " pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55"
Apr 23 16:56:06.718360 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.718336 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245"]
Apr 23 16:56:06.720481 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:06.720463 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-39400-6c4fd97fc6-7q245"]
Apr 23 16:56:07.210016 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:07.209978 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21a3ee2f-93ab-4870-901f-2efa216407ac-proxy-tls\") pod \"ensemble-graph-c88c1-79c8484c9b-hmj55\" (UID: \"21a3ee2f-93ab-4870-901f-2efa216407ac\") " pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55"
Apr 23 16:56:07.210177 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:56:07.210141 2561 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-c88c1-serving-cert: secret "ensemble-graph-c88c1-serving-cert" not found
Apr 23 16:56:07.210217 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:56:07.210211 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21a3ee2f-93ab-4870-901f-2efa216407ac-proxy-tls podName:21a3ee2f-93ab-4870-901f-2efa216407ac nodeName:}" failed. No retries permitted until 2026-04-23 16:56:08.210196237 +0000 UTC m=+1262.763630383 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/21a3ee2f-93ab-4870-901f-2efa216407ac-proxy-tls") pod "ensemble-graph-c88c1-79c8484c9b-hmj55" (UID: "21a3ee2f-93ab-4870-901f-2efa216407ac") : secret "ensemble-graph-c88c1-serving-cert" not found
Apr 23 16:56:08.031636 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:08.031603 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5029734-0ac9-4a1e-8c2a-c27358af7e09" path="/var/lib/kubelet/pods/f5029734-0ac9-4a1e-8c2a-c27358af7e09/volumes"
Apr 23 16:56:08.216639 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:08.216609 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21a3ee2f-93ab-4870-901f-2efa216407ac-proxy-tls\") pod \"ensemble-graph-c88c1-79c8484c9b-hmj55\" (UID: \"21a3ee2f-93ab-4870-901f-2efa216407ac\") " pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55"
Apr 23 16:56:08.218869 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:08.218852 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21a3ee2f-93ab-4870-901f-2efa216407ac-proxy-tls\") pod \"ensemble-graph-c88c1-79c8484c9b-hmj55\" (UID: \"21a3ee2f-93ab-4870-901f-2efa216407ac\") " pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55"
Apr 23 16:56:08.299801 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:08.299713 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55"
Apr 23 16:56:08.315666 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:08.315633 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg" podUID="99de4ace-3a2d-4fee-8d6f-5e6f79b6e79f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 23 16:56:08.415757 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:08.415730 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55"]
Apr 23 16:56:08.418229 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:56:08.418201 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21a3ee2f_93ab_4870_901f_2efa216407ac.slice/crio-d7d03a6b22d00b178221e4af5564e3ae3722035feaadfa90700144be4d542c23 WatchSource:0}: Error finding container d7d03a6b22d00b178221e4af5564e3ae3722035feaadfa90700144be4d542c23: Status 404 returned error can't find the container with id d7d03a6b22d00b178221e4af5564e3ae3722035feaadfa90700144be4d542c23
Apr 23 16:56:09.408785 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:09.408746 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55" event={"ID":"21a3ee2f-93ab-4870-901f-2efa216407ac","Type":"ContainerStarted","Data":"1fec2eda29dce261733f7350dbdc9f3354de5916613af5b95ed39e3f4a74d047"}
Apr 23 16:56:09.408785 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:09.408785 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55" event={"ID":"21a3ee2f-93ab-4870-901f-2efa216407ac","Type":"ContainerStarted","Data":"d7d03a6b22d00b178221e4af5564e3ae3722035feaadfa90700144be4d542c23"}
Apr 23 16:56:09.409315 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:09.408906 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55"
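
Annotation: the nestedpendingoperations entries show the volume mount retry backoff while the serving-cert secret does not exist yet: the first failure schedules a retry 500ms out, the second 1s out, and the third attempt at 16:56:08 succeeds once the secret appears. The delay appears to double per failure; a sketch of that doubling, assuming a 500ms base and an arbitrary cap (the controller's real constants may differ):

package main

import (
	"fmt"
	"time"
)

// nextBackoff doubles the delay after each failure, starting at 500ms,
// capped at maxDelay. Base and cap here are assumptions for illustration.
func nextBackoff(failures int) time.Duration {
	const base = 500 * time.Millisecond
	const maxDelay = 2 * time.Minute
	d := base << uint(failures) // 500ms, 1s, 2s, 4s, ...
	if d > maxDelay {
		return maxDelay
	}
	return d
}

func main() {
	for i := 0; i < 4; i++ {
		fmt.Println(nextBackoff(i)) // 500ms 1s 2s 4s; the log shows the first two
	}
}
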
Apr 23 16:56:09.431230 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:09.431193 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55" podStartSLOduration=3.4311816459999998 podStartE2EDuration="3.431181646s" podCreationTimestamp="2026-04-23 16:56:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:56:09.429828911 +0000 UTC m=+1263.983263080" watchObservedRunningTime="2026-04-23 16:56:09.431181646 +0000 UTC m=+1263.984615811"
Apr 23 16:56:15.416674 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:15.416650 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55"
Apr 23 16:56:16.560783 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:16.560752 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55"]
Apr 23 16:56:16.561168 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:16.561001 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55" podUID="21a3ee2f-93ab-4870-901f-2efa216407ac" containerName="ensemble-graph-c88c1" containerID="cri-o://1fec2eda29dce261733f7350dbdc9f3354de5916613af5b95ed39e3f4a74d047" gracePeriod=30
Apr 23 16:56:16.705209 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:16.705179 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r"]
Apr 23 16:56:16.708386 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:16.708372 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r"
Apr 23 16:56:16.720621 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:16.720481 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r"
Apr 23 16:56:16.720766 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:16.720638 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r"]
Apr 23 16:56:16.748899 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:16.748849 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6"]
Apr 23 16:56:16.749150 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:16.749119 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6" podUID="a7fe4523-9d21-4b07-9037-141b73c6c9ea" containerName="kserve-container" containerID="cri-o://c4d73fa52856389287c58f685d65493640a4c1b2344cd7777931637f8ff23f14" gracePeriod=30
Apr 23 16:56:16.845171 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:16.845147 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r"]
Apr 23 16:56:16.847858 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:56:16.847831 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9350d16_cbd5_49a9_a075_8fa3a8cfed3d.slice/crio-d73cf69d19c7c8d937d2d35e427675b9aebc79148b8650e4298adb83e940655a WatchSource:0}: Error finding container d73cf69d19c7c8d937d2d35e427675b9aebc79148b8650e4298adb83e940655a: Status 404 returned error can't find the container with id d73cf69d19c7c8d937d2d35e427675b9aebc79148b8650e4298adb83e940655a
Apr 23 16:56:17.431502 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:17.431468 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r" event={"ID":"a9350d16-cbd5-49a9-a075-8fa3a8cfed3d","Type":"ContainerStarted","Data":"1db78ad5aa15cde1884804696d29b27d5fe095ffa641ba925690f5d13c19c8c3"}
Apr 23 16:56:17.431665 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:17.431508 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r" event={"ID":"a9350d16-cbd5-49a9-a075-8fa3a8cfed3d","Type":"ContainerStarted","Data":"d73cf69d19c7c8d937d2d35e427675b9aebc79148b8650e4298adb83e940655a"}
Apr 23 16:56:17.431792 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:17.431760 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r"
Apr 23 16:56:17.432900 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:17.432860 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r" podUID="a9350d16-cbd5-49a9-a075-8fa3a8cfed3d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 23 16:56:17.447259 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:17.447223 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r" podStartSLOduration=1.447212183 podStartE2EDuration="1.447212183s" podCreationTimestamp="2026-04-23 16:56:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:56:17.446479897 +0000 UTC m=+1271.999914065" watchObservedRunningTime="2026-04-23 16:56:17.447212183 +0000 UTC m=+1272.000646330"
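
Annotation: the "SyncLoop (PLEG)" entries are the Pod Lifecycle Event Generator feeding container state changes into the kubelet's sync loop; each event carries the pod UID, a type such as ContainerStarted or ContainerDied, and a container or sandbox ID as Data, which is why every pod start above emits two ContainerStarted events (app container plus sandbox). A stripped-down model of that event stream (types are illustrative, not the kubelet's):

package main

import "fmt"

type PodLifecycleEventType string

const (
	ContainerStarted PodLifecycleEventType = "ContainerStarted"
	ContainerDied    PodLifecycleEventType = "ContainerDied"
)

// PodLifecycleEvent mirrors the event={"ID":...,"Type":...,"Data":...}
// shape in the log: pod UID, what happened, and the container/sandbox ID.
type PodLifecycleEvent struct {
	ID   string
	Type PodLifecycleEventType
	Data string
}

func main() {
	events := make(chan PodLifecycleEvent, 2)
	// One pod start produces two events: app container, then sandbox.
	events <- PodLifecycleEvent{"a9350d16", ContainerStarted, "1db78ad5"}
	events <- PodLifecycleEvent{"a9350d16", ContainerStarted, "d73cf69d"}
	close(events)
	for ev := range events {
		// The sync loop reacts to each event, e.g. by re-running pod sync.
		fmt.Printf("SyncLoop (PLEG): event for pod %s: %s %s\n", ev.ID, ev.Type, ev.Data)
	}
}
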
Apr 23 16:56:18.205671 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:18.205634 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6" podUID="a7fe4523-9d21-4b07-9037-141b73c6c9ea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 23 16:56:18.315419 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:18.315383 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg" podUID="99de4ace-3a2d-4fee-8d6f-5e6f79b6e79f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 23 16:56:18.434042 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:18.434008 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r" podUID="a9350d16-cbd5-49a9-a075-8fa3a8cfed3d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 23 16:56:19.437972 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:19.437944 2561 generic.go:358] "Generic (PLEG): container finished" podID="a7fe4523-9d21-4b07-9037-141b73c6c9ea" containerID="c4d73fa52856389287c58f685d65493640a4c1b2344cd7777931637f8ff23f14" exitCode=0
Apr 23 16:56:19.438335 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:19.437998 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6" event={"ID":"a7fe4523-9d21-4b07-9037-141b73c6c9ea","Type":"ContainerDied","Data":"c4d73fa52856389287c58f685d65493640a4c1b2344cd7777931637f8ff23f14"}
Apr 23 16:56:19.438409 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:19.438388 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r" podUID="a9350d16-cbd5-49a9-a075-8fa3a8cfed3d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 23 16:56:19.489204 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:19.489182 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6"
Apr 23 16:56:20.415963 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:20.415885 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55" podUID="21a3ee2f-93ab-4870-901f-2efa216407ac" containerName="ensemble-graph-c88c1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:56:20.441273 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:20.441244 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6" event={"ID":"a7fe4523-9d21-4b07-9037-141b73c6c9ea","Type":"ContainerDied","Data":"811cacd25852f35c82973ae2f1400e22d1cb08bb99c42cac7c5dc7a8addb812b"}
Apr 23 16:56:20.441642 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:20.441281 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6"
Apr 23 16:56:20.441642 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:20.441283 2561 scope.go:117] "RemoveContainer" containerID="c4d73fa52856389287c58f685d65493640a4c1b2344cd7777931637f8ff23f14"
Apr 23 16:56:20.463053 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:20.463027 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6"]
Apr 23 16:56:20.467347 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:20.467327 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c88c1-predictor-5c979449cc-hqsj6"]
Apr 23 16:56:22.031205 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:22.031177 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7fe4523-9d21-4b07-9037-141b73c6c9ea" path="/var/lib/kubelet/pods/a7fe4523-9d21-4b07-9037-141b73c6c9ea/volumes"
Apr 23 16:56:25.415747 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:25.415711 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55" podUID="21a3ee2f-93ab-4870-901f-2efa216407ac" containerName="ensemble-graph-c88c1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:56:28.316510 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:28.316482 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg"
Apr 23 16:56:29.438819 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:29.438780 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r" podUID="a9350d16-cbd5-49a9-a075-8fa3a8cfed3d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 23 16:56:30.415663 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:30.415630 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55" podUID="21a3ee2f-93ab-4870-901f-2efa216407ac" containerName="ensemble-graph-c88c1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:56:30.415822 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:30.415725 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55"
Apr 23 16:56:35.415965 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:35.415925 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55" podUID="21a3ee2f-93ab-4870-901f-2efa216407ac" containerName="ensemble-graph-c88c1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:56:39.438862 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:39.438817 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r" podUID="a9350d16-cbd5-49a9-a075-8fa3a8cfed3d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 23 16:56:40.416243 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:40.416205 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55" podUID="21a3ee2f-93ab-4870-901f-2efa216407ac" containerName="ensemble-graph-c88c1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:56:45.415839 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:45.415803 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55" podUID="21a3ee2f-93ab-4870-901f-2efa216407ac" containerName="ensemble-graph-c88c1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:56:46.319712 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:46.319679 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk"] Apr 23 16:56:46.319980 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:46.319967 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7fe4523-9d21-4b07-9037-141b73c6c9ea" containerName="kserve-container" Apr 23 16:56:46.319980 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:46.319981 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7fe4523-9d21-4b07-9037-141b73c6c9ea" containerName="kserve-container" Apr 23 16:56:46.320076 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:46.320028 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7fe4523-9d21-4b07-9037-141b73c6c9ea" containerName="kserve-container" Apr 23 16:56:46.322886 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:46.322860 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" Apr 23 16:56:46.330452 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:46.330432 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-65292-serving-cert\"" Apr 23 16:56:46.330890 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:46.330862 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-65292-kube-rbac-proxy-sar-config\"" Apr 23 16:56:46.346685 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:46.346663 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk"] Apr 23 16:56:46.384897 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:46.384850 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c29c107e-4f14-49ee-963c-c1e0bffd6fdb-openshift-service-ca-bundle\") pod \"sequence-graph-65292-5cfbf48c68-6gtrk\" (UID: \"c29c107e-4f14-49ee-963c-c1e0bffd6fdb\") " pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" Apr 23 16:56:46.385043 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:46.384940 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c29c107e-4f14-49ee-963c-c1e0bffd6fdb-proxy-tls\") pod \"sequence-graph-65292-5cfbf48c68-6gtrk\" (UID: \"c29c107e-4f14-49ee-963c-c1e0bffd6fdb\") " pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" Apr 23 16:56:46.485524 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:46.485487 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c29c107e-4f14-49ee-963c-c1e0bffd6fdb-openshift-service-ca-bundle\") pod \"sequence-graph-65292-5cfbf48c68-6gtrk\" (UID: \"c29c107e-4f14-49ee-963c-c1e0bffd6fdb\") " 
pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" Apr 23 16:56:46.485977 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:46.485645 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c29c107e-4f14-49ee-963c-c1e0bffd6fdb-proxy-tls\") pod \"sequence-graph-65292-5cfbf48c68-6gtrk\" (UID: \"c29c107e-4f14-49ee-963c-c1e0bffd6fdb\") " pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" Apr 23 16:56:46.485977 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:56:46.485792 2561 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-65292-serving-cert: secret "sequence-graph-65292-serving-cert" not found Apr 23 16:56:46.485977 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:56:46.485868 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c29c107e-4f14-49ee-963c-c1e0bffd6fdb-proxy-tls podName:c29c107e-4f14-49ee-963c-c1e0bffd6fdb nodeName:}" failed. No retries permitted until 2026-04-23 16:56:46.9858443 +0000 UTC m=+1301.539278454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c29c107e-4f14-49ee-963c-c1e0bffd6fdb-proxy-tls") pod "sequence-graph-65292-5cfbf48c68-6gtrk" (UID: "c29c107e-4f14-49ee-963c-c1e0bffd6fdb") : secret "sequence-graph-65292-serving-cert" not found Apr 23 16:56:46.486318 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:46.486297 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c29c107e-4f14-49ee-963c-c1e0bffd6fdb-openshift-service-ca-bundle\") pod \"sequence-graph-65292-5cfbf48c68-6gtrk\" (UID: \"c29c107e-4f14-49ee-963c-c1e0bffd6fdb\") " pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" Apr 23 16:56:46.989008 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:46.988914 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c29c107e-4f14-49ee-963c-c1e0bffd6fdb-proxy-tls\") pod \"sequence-graph-65292-5cfbf48c68-6gtrk\" (UID: \"c29c107e-4f14-49ee-963c-c1e0bffd6fdb\") " pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" Apr 23 16:56:46.991238 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:46.991219 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c29c107e-4f14-49ee-963c-c1e0bffd6fdb-proxy-tls\") pod \"sequence-graph-65292-5cfbf48c68-6gtrk\" (UID: \"c29c107e-4f14-49ee-963c-c1e0bffd6fdb\") " pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" Apr 23 16:56:47.183104 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.183082 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55" Apr 23 16:56:47.232839 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.232798 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" Apr 23 16:56:47.291502 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.291475 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21a3ee2f-93ab-4870-901f-2efa216407ac-openshift-service-ca-bundle\") pod \"21a3ee2f-93ab-4870-901f-2efa216407ac\" (UID: \"21a3ee2f-93ab-4870-901f-2efa216407ac\") " Apr 23 16:56:47.291708 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.291562 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21a3ee2f-93ab-4870-901f-2efa216407ac-proxy-tls\") pod \"21a3ee2f-93ab-4870-901f-2efa216407ac\" (UID: \"21a3ee2f-93ab-4870-901f-2efa216407ac\") " Apr 23 16:56:47.292383 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.292343 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21a3ee2f-93ab-4870-901f-2efa216407ac-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "21a3ee2f-93ab-4870-901f-2efa216407ac" (UID: "21a3ee2f-93ab-4870-901f-2efa216407ac"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:56:47.294377 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.294351 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a3ee2f-93ab-4870-901f-2efa216407ac-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "21a3ee2f-93ab-4870-901f-2efa216407ac" (UID: "21a3ee2f-93ab-4870-901f-2efa216407ac"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:56:47.350423 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.350396 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk"] Apr 23 16:56:47.352834 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:56:47.352807 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc29c107e_4f14_49ee_963c_c1e0bffd6fdb.slice/crio-4d49fbc42bc7fe43371f1d11616e168d19597ffb9dfec93fe1bfc70a6a4e507e WatchSource:0}: Error finding container 4d49fbc42bc7fe43371f1d11616e168d19597ffb9dfec93fe1bfc70a6a4e507e: Status 404 returned error can't find the container with id 4d49fbc42bc7fe43371f1d11616e168d19597ffb9dfec93fe1bfc70a6a4e507e Apr 23 16:56:47.392734 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.392712 2561 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21a3ee2f-93ab-4870-901f-2efa216407ac-openshift-service-ca-bundle\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 16:56:47.392734 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.392735 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21a3ee2f-93ab-4870-901f-2efa216407ac-proxy-tls\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 16:56:47.514149 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.514054 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" event={"ID":"c29c107e-4f14-49ee-963c-c1e0bffd6fdb","Type":"ContainerStarted","Data":"c4e14d0aaa83692125768618a6f9cc308a9a0eb0f7e0c3a0e3c478f68230051f"} Apr 23 
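
Annotation: the recurring cAdvisor warning "Failed to process watch event ... Status 404" fires when the cgroup watch sees a new crio-<id> cgroup before the runtime can report that container; the PLEG ContainerStarted events that follow each occurrence show the container comes up normally, so the warning is a benign ordering race. A tiny sketch of the tolerant handling such a race calls for (names are illustrative, not cAdvisor's internals):

package main

import (
	"errors"
	"fmt"
)

var errContainerGone = errors.New("can't find the container") // stands in for the 404

// onCgroupEvent skips the event if the runtime can't resolve the cgroup
// to a container yet; a later relist picks the container up anyway.
func onCgroupEvent(lookup func(id string) error, id string) {
	if err := lookup(id); err != nil {
		if errors.Is(err, errContainerGone) {
			fmt.Printf("Failed to process watch event for %s: %v (ignored)\n", id, err)
			return
		}
	}
	fmt.Printf("tracking container %s\n", id)
}

func main() {
	onCgroupEvent(func(id string) error { return errContainerGone }, "4d49fbc4")
}
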
Apr 23 16:56:47.514149 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.514099 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" event={"ID":"c29c107e-4f14-49ee-963c-c1e0bffd6fdb","Type":"ContainerStarted","Data":"4d49fbc42bc7fe43371f1d11616e168d19597ffb9dfec93fe1bfc70a6a4e507e"}
Apr 23 16:56:47.514612 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.514186 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk"
Apr 23 16:56:47.515067 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.515044 2561 generic.go:358] "Generic (PLEG): container finished" podID="21a3ee2f-93ab-4870-901f-2efa216407ac" containerID="1fec2eda29dce261733f7350dbdc9f3354de5916613af5b95ed39e3f4a74d047" exitCode=0
Apr 23 16:56:47.515182 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.515099 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55"
Apr 23 16:56:47.515182 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.515105 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55" event={"ID":"21a3ee2f-93ab-4870-901f-2efa216407ac","Type":"ContainerDied","Data":"1fec2eda29dce261733f7350dbdc9f3354de5916613af5b95ed39e3f4a74d047"}
Apr 23 16:56:47.515253 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.515189 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55" event={"ID":"21a3ee2f-93ab-4870-901f-2efa216407ac","Type":"ContainerDied","Data":"d7d03a6b22d00b178221e4af5564e3ae3722035feaadfa90700144be4d542c23"}
Apr 23 16:56:47.515253 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.515204 2561 scope.go:117] "RemoveContainer" containerID="1fec2eda29dce261733f7350dbdc9f3354de5916613af5b95ed39e3f4a74d047"
Apr 23 16:56:47.522911 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.522867 2561 scope.go:117] "RemoveContainer" containerID="1fec2eda29dce261733f7350dbdc9f3354de5916613af5b95ed39e3f4a74d047"
Apr 23 16:56:47.523217 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:56:47.523197 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fec2eda29dce261733f7350dbdc9f3354de5916613af5b95ed39e3f4a74d047\": container with ID starting with 1fec2eda29dce261733f7350dbdc9f3354de5916613af5b95ed39e3f4a74d047 not found: ID does not exist" containerID="1fec2eda29dce261733f7350dbdc9f3354de5916613af5b95ed39e3f4a74d047"
Apr 23 16:56:47.523289 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.523226 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fec2eda29dce261733f7350dbdc9f3354de5916613af5b95ed39e3f4a74d047"} err="failed to get container status \"1fec2eda29dce261733f7350dbdc9f3354de5916613af5b95ed39e3f4a74d047\": rpc error: code = NotFound desc = could not find container \"1fec2eda29dce261733f7350dbdc9f3354de5916613af5b95ed39e3f4a74d047\": container with ID starting with 1fec2eda29dce261733f7350dbdc9f3354de5916613af5b95ed39e3f4a74d047 not found: ID does not exist"
Apr 23 16:56:47.531497 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.531458 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" podStartSLOduration=1.531447997 podStartE2EDuration="1.531447997s" podCreationTimestamp="2026-04-23 16:56:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:56:47.531039023 +0000 UTC m=+1302.084473193" watchObservedRunningTime="2026-04-23 16:56:47.531447997 +0000 UTC m=+1302.084882191"
Apr 23 16:56:47.544232 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.544211 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55"]
Apr 23 16:56:47.549593 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:47.549570 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-c88c1-79c8484c9b-hmj55"]
Apr 23 16:56:48.032904 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:48.032144 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21a3ee2f-93ab-4870-901f-2efa216407ac" path="/var/lib/kubelet/pods/21a3ee2f-93ab-4870-901f-2efa216407ac/volumes"
Apr 23 16:56:49.438849 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:49.438811 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r" podUID="a9350d16-cbd5-49a9-a075-8fa3a8cfed3d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 23 16:56:53.524404 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:53.524378 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk"
Apr 23 16:56:56.348609 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:56.348578 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk"]
Apr 23 16:56:56.349024 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:56.348795 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" podUID="c29c107e-4f14-49ee-963c-c1e0bffd6fdb" containerName="sequence-graph-65292" containerID="cri-o://c4e14d0aaa83692125768618a6f9cc308a9a0eb0f7e0c3a0e3c478f68230051f" gracePeriod=30
Apr 23 16:56:56.434370 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:56.434335 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg"]
Apr 23 16:56:56.434681 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:56.434646 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg" podUID="99de4ace-3a2d-4fee-8d6f-5e6f79b6e79f" containerName="kserve-container" containerID="cri-o://69db51b436832c1ccda641971cc772c5447d53ad43a2ea35a710cf825f67a716" gracePeriod=30
Apr 23 16:56:56.470943 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:56.470916 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds"]
Apr 23 16:56:56.471181 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:56.471168 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21a3ee2f-93ab-4870-901f-2efa216407ac" containerName="ensemble-graph-c88c1"
Apr 23 16:56:56.471227 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:56.471182 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="21a3ee2f-93ab-4870-901f-2efa216407ac" containerName="ensemble-graph-c88c1"
Apr 23 16:56:56.471260 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:56.471240 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="21a3ee2f-93ab-4870-901f-2efa216407ac" containerName="ensemble-graph-c88c1"
I0423 16:56:56.471240 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="21a3ee2f-93ab-4870-901f-2efa216407ac" containerName="ensemble-graph-c88c1" Apr 23 16:56:56.475256 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:56.475205 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds" Apr 23 16:56:56.480757 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:56.480735 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds"] Apr 23 16:56:56.486317 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:56.486301 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds" Apr 23 16:56:56.607683 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:56.607613 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds"] Apr 23 16:56:56.610919 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:56:56.610856 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd948d245_4a2d_49b9_a873_32c041bc5cc0.slice/crio-50e5d9cf1688015cb6fad899cda56d223cb30ddf46cfc96a82be5da62289aefb WatchSource:0}: Error finding container 50e5d9cf1688015cb6fad899cda56d223cb30ddf46cfc96a82be5da62289aefb: Status 404 returned error can't find the container with id 50e5d9cf1688015cb6fad899cda56d223cb30ddf46cfc96a82be5da62289aefb Apr 23 16:56:57.545798 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:57.545764 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds" event={"ID":"d948d245-4a2d-49b9-a873-32c041bc5cc0","Type":"ContainerStarted","Data":"5aa2de7e654081432476c634ae9260711502120576ce13ad3dadff31d1d63991"} Apr 23 16:56:57.545798 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:57.545801 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds" event={"ID":"d948d245-4a2d-49b9-a873-32c041bc5cc0","Type":"ContainerStarted","Data":"50e5d9cf1688015cb6fad899cda56d223cb30ddf46cfc96a82be5da62289aefb"} Apr 23 16:56:57.546281 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:57.545910 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds" Apr 23 16:56:57.547324 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:57.547296 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds" podUID="d948d245-4a2d-49b9-a873-32c041bc5cc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 23 16:56:57.560826 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:57.560781 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds" podStartSLOduration=1.560768828 podStartE2EDuration="1.560768828s" podCreationTimestamp="2026-04-23 16:56:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:56:57.559567762 +0000 UTC m=+1312.113001934" watchObservedRunningTime="2026-04-23 16:56:57.560768828 +0000 
UTC m=+1312.114202997" Apr 23 16:56:58.315861 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:58.315819 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg" podUID="99de4ace-3a2d-4fee-8d6f-5e6f79b6e79f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 23 16:56:58.522549 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:58.522508 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" podUID="c29c107e-4f14-49ee-963c-c1e0bffd6fdb" containerName="sequence-graph-65292" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:56:58.552690 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:58.550790 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds" podUID="d948d245-4a2d-49b9-a873-32c041bc5cc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 23 16:56:59.371032 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:59.371013 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg" Apr 23 16:56:59.439455 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:59.439389 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r" podUID="a9350d16-cbd5-49a9-a075-8fa3a8cfed3d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 23 16:56:59.553221 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:59.553192 2561 generic.go:358] "Generic (PLEG): container finished" podID="99de4ace-3a2d-4fee-8d6f-5e6f79b6e79f" containerID="69db51b436832c1ccda641971cc772c5447d53ad43a2ea35a710cf825f67a716" exitCode=0 Apr 23 16:56:59.553547 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:59.553249 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg" Apr 23 16:56:59.553547 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:59.553265 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg" event={"ID":"99de4ace-3a2d-4fee-8d6f-5e6f79b6e79f","Type":"ContainerDied","Data":"69db51b436832c1ccda641971cc772c5447d53ad43a2ea35a710cf825f67a716"} Apr 23 16:56:59.553547 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:59.553298 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg" event={"ID":"99de4ace-3a2d-4fee-8d6f-5e6f79b6e79f","Type":"ContainerDied","Data":"fe1d085dbe6efbab808e098959adfdb34104012fb1090b2ea83963b59dc29816"} Apr 23 16:56:59.553547 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:59.553312 2561 scope.go:117] "RemoveContainer" containerID="69db51b436832c1ccda641971cc772c5447d53ad43a2ea35a710cf825f67a716" Apr 23 16:56:59.560521 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:59.560421 2561 scope.go:117] "RemoveContainer" containerID="69db51b436832c1ccda641971cc772c5447d53ad43a2ea35a710cf825f67a716" Apr 23 16:56:59.560751 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:56:59.560728 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69db51b436832c1ccda641971cc772c5447d53ad43a2ea35a710cf825f67a716\": container with ID starting with 69db51b436832c1ccda641971cc772c5447d53ad43a2ea35a710cf825f67a716 not found: ID does not exist" containerID="69db51b436832c1ccda641971cc772c5447d53ad43a2ea35a710cf825f67a716" Apr 23 16:56:59.560839 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:59.560760 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69db51b436832c1ccda641971cc772c5447d53ad43a2ea35a710cf825f67a716"} err="failed to get container status \"69db51b436832c1ccda641971cc772c5447d53ad43a2ea35a710cf825f67a716\": rpc error: code = NotFound desc = could not find container \"69db51b436832c1ccda641971cc772c5447d53ad43a2ea35a710cf825f67a716\": container with ID starting with 69db51b436832c1ccda641971cc772c5447d53ad43a2ea35a710cf825f67a716 not found: ID does not exist" Apr 23 16:56:59.572433 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:59.572412 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg"] Apr 23 16:56:59.575917 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:56:59.575899 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-65292-predictor-67b8bb4fcf-clsjg"] Apr 23 16:57:00.031340 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:00.031312 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99de4ace-3a2d-4fee-8d6f-5e6f79b6e79f" path="/var/lib/kubelet/pods/99de4ace-3a2d-4fee-8d6f-5e6f79b6e79f/volumes" Apr 23 16:57:03.522929 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:03.522895 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" podUID="c29c107e-4f14-49ee-963c-c1e0bffd6fdb" containerName="sequence-graph-65292" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:57:08.522553 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:08.522511 2561 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" podUID="c29c107e-4f14-49ee-963c-c1e0bffd6fdb" containerName="sequence-graph-65292" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:57:08.523046 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:08.522620 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" Apr 23 16:57:08.551598 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:08.551558 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds" podUID="d948d245-4a2d-49b9-a873-32c041bc5cc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 23 16:57:09.440395 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:09.440359 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r" Apr 23 16:57:13.523093 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:13.523051 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" podUID="c29c107e-4f14-49ee-963c-c1e0bffd6fdb" containerName="sequence-graph-65292" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:57:18.523043 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:18.523007 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" podUID="c29c107e-4f14-49ee-963c-c1e0bffd6fdb" containerName="sequence-graph-65292" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:57:18.551213 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:18.551180 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds" podUID="d948d245-4a2d-49b9-a873-32c041bc5cc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 23 16:57:23.523227 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:23.523187 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" podUID="c29c107e-4f14-49ee-963c-c1e0bffd6fdb" containerName="sequence-graph-65292" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:57:26.369437 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:57:26.369401 2561 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc29c107e_4f14_49ee_963c_c1e0bffd6fdb.slice/crio-conmon-c4e14d0aaa83692125768618a6f9cc308a9a0eb0f7e0c3a0e3c478f68230051f.scope\": RecentStats: unable to find data in memory cache]" Apr 23 16:57:26.369731 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:57:26.369451 2561 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc29c107e_4f14_49ee_963c_c1e0bffd6fdb.slice/crio-conmon-c4e14d0aaa83692125768618a6f9cc308a9a0eb0f7e0c3a0e3c478f68230051f.scope\": RecentStats: unable to find data in memory cache]" Apr 23 16:57:26.482690 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.482667 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" Apr 23 16:57:26.554597 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.554566 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c29c107e-4f14-49ee-963c-c1e0bffd6fdb-proxy-tls\") pod \"c29c107e-4f14-49ee-963c-c1e0bffd6fdb\" (UID: \"c29c107e-4f14-49ee-963c-c1e0bffd6fdb\") " Apr 23 16:57:26.554742 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.554618 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c29c107e-4f14-49ee-963c-c1e0bffd6fdb-openshift-service-ca-bundle\") pod \"c29c107e-4f14-49ee-963c-c1e0bffd6fdb\" (UID: \"c29c107e-4f14-49ee-963c-c1e0bffd6fdb\") " Apr 23 16:57:26.554998 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.554976 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c29c107e-4f14-49ee-963c-c1e0bffd6fdb-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "c29c107e-4f14-49ee-963c-c1e0bffd6fdb" (UID: "c29c107e-4f14-49ee-963c-c1e0bffd6fdb"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:57:26.556555 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.556535 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c29c107e-4f14-49ee-963c-c1e0bffd6fdb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c29c107e-4f14-49ee-963c-c1e0bffd6fdb" (UID: "c29c107e-4f14-49ee-963c-c1e0bffd6fdb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:57:26.628073 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.626024 2561 generic.go:358] "Generic (PLEG): container finished" podID="c29c107e-4f14-49ee-963c-c1e0bffd6fdb" containerID="c4e14d0aaa83692125768618a6f9cc308a9a0eb0f7e0c3a0e3c478f68230051f" exitCode=0 Apr 23 16:57:26.628073 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.626118 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" event={"ID":"c29c107e-4f14-49ee-963c-c1e0bffd6fdb","Type":"ContainerDied","Data":"c4e14d0aaa83692125768618a6f9cc308a9a0eb0f7e0c3a0e3c478f68230051f"} Apr 23 16:57:26.628073 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.626148 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" event={"ID":"c29c107e-4f14-49ee-963c-c1e0bffd6fdb","Type":"ContainerDied","Data":"4d49fbc42bc7fe43371f1d11616e168d19597ffb9dfec93fe1bfc70a6a4e507e"} Apr 23 16:57:26.628073 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.626169 2561 scope.go:117] "RemoveContainer" containerID="c4e14d0aaa83692125768618a6f9cc308a9a0eb0f7e0c3a0e3c478f68230051f" Apr 23 16:57:26.628073 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.626375 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk" Apr 23 16:57:26.635173 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.635154 2561 scope.go:117] "RemoveContainer" containerID="c4e14d0aaa83692125768618a6f9cc308a9a0eb0f7e0c3a0e3c478f68230051f" Apr 23 16:57:26.635439 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:57:26.635419 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4e14d0aaa83692125768618a6f9cc308a9a0eb0f7e0c3a0e3c478f68230051f\": container with ID starting with c4e14d0aaa83692125768618a6f9cc308a9a0eb0f7e0c3a0e3c478f68230051f not found: ID does not exist" containerID="c4e14d0aaa83692125768618a6f9cc308a9a0eb0f7e0c3a0e3c478f68230051f" Apr 23 16:57:26.635500 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.635445 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4e14d0aaa83692125768618a6f9cc308a9a0eb0f7e0c3a0e3c478f68230051f"} err="failed to get container status \"c4e14d0aaa83692125768618a6f9cc308a9a0eb0f7e0c3a0e3c478f68230051f\": rpc error: code = NotFound desc = could not find container \"c4e14d0aaa83692125768618a6f9cc308a9a0eb0f7e0c3a0e3c478f68230051f\": container with ID starting with c4e14d0aaa83692125768618a6f9cc308a9a0eb0f7e0c3a0e3c478f68230051f not found: ID does not exist" Apr 23 16:57:26.647289 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.647270 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk"] Apr 23 16:57:26.650394 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.650374 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-65292-5cfbf48c68-6gtrk"] Apr 23 16:57:26.655609 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.655585 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c29c107e-4f14-49ee-963c-c1e0bffd6fdb-proxy-tls\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 16:57:26.655609 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.655607 2561 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c29c107e-4f14-49ee-963c-c1e0bffd6fdb-openshift-service-ca-bundle\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 16:57:26.773307 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.773281 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7"] Apr 23 16:57:26.773545 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.773534 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c29c107e-4f14-49ee-963c-c1e0bffd6fdb" containerName="sequence-graph-65292" Apr 23 16:57:26.773584 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.773546 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="c29c107e-4f14-49ee-963c-c1e0bffd6fdb" containerName="sequence-graph-65292" Apr 23 16:57:26.773584 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.773562 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99de4ace-3a2d-4fee-8d6f-5e6f79b6e79f" containerName="kserve-container" Apr 23 16:57:26.773584 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.773568 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="99de4ace-3a2d-4fee-8d6f-5e6f79b6e79f" containerName="kserve-container" Apr 23 16:57:26.773675 
ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.773610 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="99de4ace-3a2d-4fee-8d6f-5e6f79b6e79f" containerName="kserve-container" Apr 23 16:57:26.773675 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.773618 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="c29c107e-4f14-49ee-963c-c1e0bffd6fdb" containerName="sequence-graph-65292" Apr 23 16:57:26.777589 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.777570 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" Apr 23 16:57:26.780045 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.780025 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-3ffea-kube-rbac-proxy-sar-config\"" Apr 23 16:57:26.780045 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.780039 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 16:57:26.780215 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.780137 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-3ffea-serving-cert\"" Apr 23 16:57:26.785266 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.785245 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7"] Apr 23 16:57:26.856819 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.856796 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30aae997-54b9-46f6-8ff7-17f498f7bc77-openshift-service-ca-bundle\") pod \"ensemble-graph-3ffea-cfb579db8-9spn7\" (UID: \"30aae997-54b9-46f6-8ff7-17f498f7bc77\") " pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" Apr 23 16:57:26.856956 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.856853 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30aae997-54b9-46f6-8ff7-17f498f7bc77-proxy-tls\") pod \"ensemble-graph-3ffea-cfb579db8-9spn7\" (UID: \"30aae997-54b9-46f6-8ff7-17f498f7bc77\") " pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" Apr 23 16:57:26.957809 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.957744 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30aae997-54b9-46f6-8ff7-17f498f7bc77-proxy-tls\") pod \"ensemble-graph-3ffea-cfb579db8-9spn7\" (UID: \"30aae997-54b9-46f6-8ff7-17f498f7bc77\") " pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" Apr 23 16:57:26.957809 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.957782 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30aae997-54b9-46f6-8ff7-17f498f7bc77-openshift-service-ca-bundle\") pod \"ensemble-graph-3ffea-cfb579db8-9spn7\" (UID: \"30aae997-54b9-46f6-8ff7-17f498f7bc77\") " pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" Apr 23 16:57:26.957990 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:57:26.957894 2561 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-3ffea-serving-cert: secret "ensemble-graph-3ffea-serving-cert" not found Apr 
23 16:57:26.957990 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:57:26.957951 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30aae997-54b9-46f6-8ff7-17f498f7bc77-proxy-tls podName:30aae997-54b9-46f6-8ff7-17f498f7bc77 nodeName:}" failed. No retries permitted until 2026-04-23 16:57:27.457936199 +0000 UTC m=+1342.011370351 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/30aae997-54b9-46f6-8ff7-17f498f7bc77-proxy-tls") pod "ensemble-graph-3ffea-cfb579db8-9spn7" (UID: "30aae997-54b9-46f6-8ff7-17f498f7bc77") : secret "ensemble-graph-3ffea-serving-cert" not found Apr 23 16:57:26.958471 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:26.958454 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30aae997-54b9-46f6-8ff7-17f498f7bc77-openshift-service-ca-bundle\") pod \"ensemble-graph-3ffea-cfb579db8-9spn7\" (UID: \"30aae997-54b9-46f6-8ff7-17f498f7bc77\") " pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" Apr 23 16:57:27.461026 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:27.460996 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30aae997-54b9-46f6-8ff7-17f498f7bc77-proxy-tls\") pod \"ensemble-graph-3ffea-cfb579db8-9spn7\" (UID: \"30aae997-54b9-46f6-8ff7-17f498f7bc77\") " pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" Apr 23 16:57:27.463328 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:27.463304 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30aae997-54b9-46f6-8ff7-17f498f7bc77-proxy-tls\") pod \"ensemble-graph-3ffea-cfb579db8-9spn7\" (UID: \"30aae997-54b9-46f6-8ff7-17f498f7bc77\") " pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" Apr 23 16:57:27.688122 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:27.688095 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" Apr 23 16:57:27.807910 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:27.807867 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7"] Apr 23 16:57:27.810991 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:57:27.810967 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30aae997_54b9_46f6_8ff7_17f498f7bc77.slice/crio-e10ab99a25a24663b58d5223994ad9d1427101e7db084d6f12fb32253128aca3 WatchSource:0}: Error finding container e10ab99a25a24663b58d5223994ad9d1427101e7db084d6f12fb32253128aca3: Status 404 returned error can't find the container with id e10ab99a25a24663b58d5223994ad9d1427101e7db084d6f12fb32253128aca3 Apr 23 16:57:28.031340 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:28.031305 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c29c107e-4f14-49ee-963c-c1e0bffd6fdb" path="/var/lib/kubelet/pods/c29c107e-4f14-49ee-963c-c1e0bffd6fdb/volumes" Apr 23 16:57:28.551808 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:28.551775 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds" podUID="d948d245-4a2d-49b9-a873-32c041bc5cc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 23 16:57:28.634260 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:28.634227 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" event={"ID":"30aae997-54b9-46f6-8ff7-17f498f7bc77","Type":"ContainerStarted","Data":"5d40dcc4afb92cc0e4db00f0a45aae1afa84ec2bd1e6598869631dc8c24b6100"} Apr 23 16:57:28.634260 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:28.634261 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" event={"ID":"30aae997-54b9-46f6-8ff7-17f498f7bc77","Type":"ContainerStarted","Data":"e10ab99a25a24663b58d5223994ad9d1427101e7db084d6f12fb32253128aca3"} Apr 23 16:57:28.634446 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:28.634349 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" Apr 23 16:57:28.651003 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:28.650962 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" podStartSLOduration=2.650949624 podStartE2EDuration="2.650949624s" podCreationTimestamp="2026-04-23 16:57:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:57:28.649405408 +0000 UTC m=+1343.202839578" watchObservedRunningTime="2026-04-23 16:57:28.650949624 +0000 UTC m=+1343.204383792" Apr 23 16:57:34.642789 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:34.642762 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" Apr 23 16:57:38.551670 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:38.551625 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds" podUID="d948d245-4a2d-49b9-a873-32c041bc5cc0" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 23 16:57:48.552782 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:57:48.552748 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds" Apr 23 16:58:06.559227 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:58:06.559189 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh"] Apr 23 16:58:06.562485 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:58:06.562464 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" Apr 23 16:58:06.564867 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:58:06.564847 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-a7ecd-kube-rbac-proxy-sar-config\"" Apr 23 16:58:06.564982 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:58:06.564846 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-a7ecd-serving-cert\"" Apr 23 16:58:06.575717 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:58:06.575690 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh"] Apr 23 16:58:06.724559 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:58:06.724524 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13a91d7b-802c-46b5-855c-48425a6dc410-proxy-tls\") pod \"sequence-graph-a7ecd-78b4fbf76f-gbcrh\" (UID: \"13a91d7b-802c-46b5-855c-48425a6dc410\") " pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" Apr 23 16:58:06.724730 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:58:06.724591 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13a91d7b-802c-46b5-855c-48425a6dc410-openshift-service-ca-bundle\") pod \"sequence-graph-a7ecd-78b4fbf76f-gbcrh\" (UID: \"13a91d7b-802c-46b5-855c-48425a6dc410\") " pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" Apr 23 16:58:06.825552 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:58:06.825468 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13a91d7b-802c-46b5-855c-48425a6dc410-openshift-service-ca-bundle\") pod \"sequence-graph-a7ecd-78b4fbf76f-gbcrh\" (UID: \"13a91d7b-802c-46b5-855c-48425a6dc410\") " pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" Apr 23 16:58:06.825552 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:58:06.825511 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13a91d7b-802c-46b5-855c-48425a6dc410-proxy-tls\") pod \"sequence-graph-a7ecd-78b4fbf76f-gbcrh\" (UID: \"13a91d7b-802c-46b5-855c-48425a6dc410\") " pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" Apr 23 16:58:06.825728 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:58:06.825615 2561 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-a7ecd-serving-cert: secret "sequence-graph-a7ecd-serving-cert" not found Apr 23 16:58:06.825728 ip-10-0-141-189 kubenswrapper[2561]: E0423 16:58:06.825674 2561 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/13a91d7b-802c-46b5-855c-48425a6dc410-proxy-tls podName:13a91d7b-802c-46b5-855c-48425a6dc410 nodeName:}" failed. No retries permitted until 2026-04-23 16:58:07.325655612 +0000 UTC m=+1381.879089759 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/13a91d7b-802c-46b5-855c-48425a6dc410-proxy-tls") pod "sequence-graph-a7ecd-78b4fbf76f-gbcrh" (UID: "13a91d7b-802c-46b5-855c-48425a6dc410") : secret "sequence-graph-a7ecd-serving-cert" not found Apr 23 16:58:06.826100 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:58:06.826080 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13a91d7b-802c-46b5-855c-48425a6dc410-openshift-service-ca-bundle\") pod \"sequence-graph-a7ecd-78b4fbf76f-gbcrh\" (UID: \"13a91d7b-802c-46b5-855c-48425a6dc410\") " pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" Apr 23 16:58:07.328508 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:58:07.328480 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13a91d7b-802c-46b5-855c-48425a6dc410-proxy-tls\") pod \"sequence-graph-a7ecd-78b4fbf76f-gbcrh\" (UID: \"13a91d7b-802c-46b5-855c-48425a6dc410\") " pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" Apr 23 16:58:07.330801 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:58:07.330779 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13a91d7b-802c-46b5-855c-48425a6dc410-proxy-tls\") pod \"sequence-graph-a7ecd-78b4fbf76f-gbcrh\" (UID: \"13a91d7b-802c-46b5-855c-48425a6dc410\") " pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" Apr 23 16:58:07.473333 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:58:07.473301 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" Apr 23 16:58:07.586812 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:58:07.586736 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh"] Apr 23 16:58:07.590232 ip-10-0-141-189 kubenswrapper[2561]: W0423 16:58:07.590202 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13a91d7b_802c_46b5_855c_48425a6dc410.slice/crio-60751a7a8e3f4acbd53136e6551ebfb3928070fafea8e0200a69f2d2f0ce9603 WatchSource:0}: Error finding container 60751a7a8e3f4acbd53136e6551ebfb3928070fafea8e0200a69f2d2f0ce9603: Status 404 returned error can't find the container with id 60751a7a8e3f4acbd53136e6551ebfb3928070fafea8e0200a69f2d2f0ce9603 Apr 23 16:58:07.742563 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:58:07.742524 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" event={"ID":"13a91d7b-802c-46b5-855c-48425a6dc410","Type":"ContainerStarted","Data":"792a204803745e76ff636bb8b64ab23d3c6b9941cc1f97de621241efae0fbd68"} Apr 23 16:58:07.742563 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:58:07.742563 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" event={"ID":"13a91d7b-802c-46b5-855c-48425a6dc410","Type":"ContainerStarted","Data":"60751a7a8e3f4acbd53136e6551ebfb3928070fafea8e0200a69f2d2f0ce9603"} Apr 23 16:58:07.742801 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:58:07.742650 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" Apr 23 16:58:07.759112 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:58:07.759056 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" podStartSLOduration=1.759038093 podStartE2EDuration="1.759038093s" podCreationTimestamp="2026-04-23 16:58:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:58:07.758932135 +0000 UTC m=+1382.312366307" watchObservedRunningTime="2026-04-23 16:58:07.759038093 +0000 UTC m=+1382.312472263" Apr 23 16:58:13.750868 ip-10-0-141-189 kubenswrapper[2561]: I0423 16:58:13.750840 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" Apr 23 17:05:41.452129 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:41.452089 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7"] Apr 23 17:05:41.452639 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:41.452400 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" podUID="30aae997-54b9-46f6-8ff7-17f498f7bc77" containerName="ensemble-graph-3ffea" containerID="cri-o://5d40dcc4afb92cc0e4db00f0a45aae1afa84ec2bd1e6598869631dc8c24b6100" gracePeriod=30 Apr 23 17:05:41.537519 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:41.537492 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r"] Apr 23 17:05:41.537737 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:41.537717 2561 kuberuntime_container.go:864] "Killing container 
with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r" podUID="a9350d16-cbd5-49a9-a075-8fa3a8cfed3d" containerName="kserve-container" containerID="cri-o://1db78ad5aa15cde1884804696d29b27d5fe095ffa641ba925690f5d13c19c8c3" gracePeriod=30 Apr 23 17:05:41.603228 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:41.603196 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7"] Apr 23 17:05:41.606582 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:41.606557 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7" Apr 23 17:05:41.617856 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:41.617839 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7" Apr 23 17:05:41.622308 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:41.622282 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7"] Apr 23 17:05:41.745176 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:41.745107 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7"] Apr 23 17:05:41.748800 ip-10-0-141-189 kubenswrapper[2561]: W0423 17:05:41.748774 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4da01992_da60_469a_8bb9_bd54a37f9e30.slice/crio-2828f392f274306ef845a4ce69315bdde5d627ac9ef5de55abbdc58ff024a450 WatchSource:0}: Error finding container 2828f392f274306ef845a4ce69315bdde5d627ac9ef5de55abbdc58ff024a450: Status 404 returned error can't find the container with id 2828f392f274306ef845a4ce69315bdde5d627ac9ef5de55abbdc58ff024a450 Apr 23 17:05:41.750987 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:41.750969 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 17:05:41.900571 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:41.900541 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7" event={"ID":"4da01992-da60-469a-8bb9-bd54a37f9e30","Type":"ContainerStarted","Data":"7d73ccb525bf062b11f1a6a5e2f2170231002e020aec4af46beaac6266251033"} Apr 23 17:05:41.900571 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:41.900575 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7" event={"ID":"4da01992-da60-469a-8bb9-bd54a37f9e30","Type":"ContainerStarted","Data":"2828f392f274306ef845a4ce69315bdde5d627ac9ef5de55abbdc58ff024a450"} Apr 23 17:05:41.900753 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:41.900739 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7" Apr 23 17:05:41.901947 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:41.901922 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7" podUID="4da01992-da60-469a-8bb9-bd54a37f9e30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 17:05:41.920463 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:41.920424 2561 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7" podStartSLOduration=0.920413623 podStartE2EDuration="920.413623ms" podCreationTimestamp="2026-04-23 17:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:05:41.919659387 +0000 UTC m=+1836.473093554" watchObservedRunningTime="2026-04-23 17:05:41.920413623 +0000 UTC m=+1836.473847792" Apr 23 17:05:42.903039 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:42.903000 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7" podUID="4da01992-da60-469a-8bb9-bd54a37f9e30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 17:05:44.386131 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:44.386108 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r" Apr 23 17:05:44.641575 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:44.641533 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" podUID="30aae997-54b9-46f6-8ff7-17f498f7bc77" containerName="ensemble-graph-3ffea" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:05:44.908841 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:44.908708 2561 generic.go:358] "Generic (PLEG): container finished" podID="a9350d16-cbd5-49a9-a075-8fa3a8cfed3d" containerID="1db78ad5aa15cde1884804696d29b27d5fe095ffa641ba925690f5d13c19c8c3" exitCode=0 Apr 23 17:05:44.908841 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:44.908812 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r" event={"ID":"a9350d16-cbd5-49a9-a075-8fa3a8cfed3d","Type":"ContainerDied","Data":"1db78ad5aa15cde1884804696d29b27d5fe095ffa641ba925690f5d13c19c8c3"} Apr 23 17:05:44.909035 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:44.908846 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r" event={"ID":"a9350d16-cbd5-49a9-a075-8fa3a8cfed3d","Type":"ContainerDied","Data":"d73cf69d19c7c8d937d2d35e427675b9aebc79148b8650e4298adb83e940655a"} Apr 23 17:05:44.909035 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:44.908867 2561 scope.go:117] "RemoveContainer" containerID="1db78ad5aa15cde1884804696d29b27d5fe095ffa641ba925690f5d13c19c8c3" Apr 23 17:05:44.909035 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:44.908868 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r" Apr 23 17:05:44.918753 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:44.918723 2561 scope.go:117] "RemoveContainer" containerID="1db78ad5aa15cde1884804696d29b27d5fe095ffa641ba925690f5d13c19c8c3" Apr 23 17:05:44.919151 ip-10-0-141-189 kubenswrapper[2561]: E0423 17:05:44.919127 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1db78ad5aa15cde1884804696d29b27d5fe095ffa641ba925690f5d13c19c8c3\": container with ID starting with 1db78ad5aa15cde1884804696d29b27d5fe095ffa641ba925690f5d13c19c8c3 not found: ID does not exist" containerID="1db78ad5aa15cde1884804696d29b27d5fe095ffa641ba925690f5d13c19c8c3" Apr 23 17:05:44.919245 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:44.919163 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1db78ad5aa15cde1884804696d29b27d5fe095ffa641ba925690f5d13c19c8c3"} err="failed to get container status \"1db78ad5aa15cde1884804696d29b27d5fe095ffa641ba925690f5d13c19c8c3\": rpc error: code = NotFound desc = could not find container \"1db78ad5aa15cde1884804696d29b27d5fe095ffa641ba925690f5d13c19c8c3\": container with ID starting with 1db78ad5aa15cde1884804696d29b27d5fe095ffa641ba925690f5d13c19c8c3 not found: ID does not exist" Apr 23 17:05:44.930978 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:44.930956 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r"] Apr 23 17:05:44.932890 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:44.932851 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3ffea-predictor-5bfd58598f-x8f9r"] Apr 23 17:05:46.031464 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:46.031435 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9350d16-cbd5-49a9-a075-8fa3a8cfed3d" path="/var/lib/kubelet/pods/a9350d16-cbd5-49a9-a075-8fa3a8cfed3d/volumes" Apr 23 17:05:49.641354 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:49.641314 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" podUID="30aae997-54b9-46f6-8ff7-17f498f7bc77" containerName="ensemble-graph-3ffea" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:05:52.903712 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:52.903669 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7" podUID="4da01992-da60-469a-8bb9-bd54a37f9e30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 17:05:54.641705 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:54.641670 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" podUID="30aae997-54b9-46f6-8ff7-17f498f7bc77" containerName="ensemble-graph-3ffea" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:05:54.642056 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:54.641768 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" Apr 23 17:05:59.640756 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:05:59.640717 2561 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" podUID="30aae997-54b9-46f6-8ff7-17f498f7bc77" containerName="ensemble-graph-3ffea" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:06:02.903265 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:02.903222 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7" podUID="4da01992-da60-469a-8bb9-bd54a37f9e30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 17:06:04.641772 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:04.641738 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" podUID="30aae997-54b9-46f6-8ff7-17f498f7bc77" containerName="ensemble-graph-3ffea" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:06:09.641788 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:09.641752 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" podUID="30aae997-54b9-46f6-8ff7-17f498f7bc77" containerName="ensemble-graph-3ffea" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:06:11.594426 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:11.594382 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" Apr 23 17:06:11.689340 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:11.689308 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30aae997-54b9-46f6-8ff7-17f498f7bc77-proxy-tls\") pod \"30aae997-54b9-46f6-8ff7-17f498f7bc77\" (UID: \"30aae997-54b9-46f6-8ff7-17f498f7bc77\") " Apr 23 17:06:11.689340 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:11.689343 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30aae997-54b9-46f6-8ff7-17f498f7bc77-openshift-service-ca-bundle\") pod \"30aae997-54b9-46f6-8ff7-17f498f7bc77\" (UID: \"30aae997-54b9-46f6-8ff7-17f498f7bc77\") " Apr 23 17:06:11.689681 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:11.689662 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30aae997-54b9-46f6-8ff7-17f498f7bc77-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "30aae997-54b9-46f6-8ff7-17f498f7bc77" (UID: "30aae997-54b9-46f6-8ff7-17f498f7bc77"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:06:11.691467 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:11.691436 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30aae997-54b9-46f6-8ff7-17f498f7bc77-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "30aae997-54b9-46f6-8ff7-17f498f7bc77" (UID: "30aae997-54b9-46f6-8ff7-17f498f7bc77"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:06:11.790558 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:11.790534 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30aae997-54b9-46f6-8ff7-17f498f7bc77-proxy-tls\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 17:06:11.790558 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:11.790558 2561 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30aae997-54b9-46f6-8ff7-17f498f7bc77-openshift-service-ca-bundle\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 17:06:11.980424 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:11.980393 2561 generic.go:358] "Generic (PLEG): container finished" podID="30aae997-54b9-46f6-8ff7-17f498f7bc77" containerID="5d40dcc4afb92cc0e4db00f0a45aae1afa84ec2bd1e6598869631dc8c24b6100" exitCode=0 Apr 23 17:06:11.980561 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:11.980450 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" Apr 23 17:06:11.980561 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:11.980449 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" event={"ID":"30aae997-54b9-46f6-8ff7-17f498f7bc77","Type":"ContainerDied","Data":"5d40dcc4afb92cc0e4db00f0a45aae1afa84ec2bd1e6598869631dc8c24b6100"} Apr 23 17:06:11.980561 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:11.980549 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7" event={"ID":"30aae997-54b9-46f6-8ff7-17f498f7bc77","Type":"ContainerDied","Data":"e10ab99a25a24663b58d5223994ad9d1427101e7db084d6f12fb32253128aca3"} Apr 23 17:06:11.980677 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:11.980565 2561 scope.go:117] "RemoveContainer" containerID="5d40dcc4afb92cc0e4db00f0a45aae1afa84ec2bd1e6598869631dc8c24b6100" Apr 23 17:06:11.989394 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:11.989377 2561 scope.go:117] "RemoveContainer" containerID="5d40dcc4afb92cc0e4db00f0a45aae1afa84ec2bd1e6598869631dc8c24b6100" Apr 23 17:06:11.989636 ip-10-0-141-189 kubenswrapper[2561]: E0423 17:06:11.989621 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d40dcc4afb92cc0e4db00f0a45aae1afa84ec2bd1e6598869631dc8c24b6100\": container with ID starting with 5d40dcc4afb92cc0e4db00f0a45aae1afa84ec2bd1e6598869631dc8c24b6100 not found: ID does not exist" containerID="5d40dcc4afb92cc0e4db00f0a45aae1afa84ec2bd1e6598869631dc8c24b6100" Apr 23 17:06:11.989678 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:11.989643 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d40dcc4afb92cc0e4db00f0a45aae1afa84ec2bd1e6598869631dc8c24b6100"} err="failed to get container status \"5d40dcc4afb92cc0e4db00f0a45aae1afa84ec2bd1e6598869631dc8c24b6100\": rpc error: code = NotFound desc = could not find container \"5d40dcc4afb92cc0e4db00f0a45aae1afa84ec2bd1e6598869631dc8c24b6100\": container with ID starting with 5d40dcc4afb92cc0e4db00f0a45aae1afa84ec2bd1e6598869631dc8c24b6100 not found: ID does not exist" Apr 23 17:06:12.001467 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:12.001446 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7"] Apr 23 17:06:12.004255 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:12.004237 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-3ffea-cfb579db8-9spn7"] Apr 23 17:06:12.031463 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:12.031441 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30aae997-54b9-46f6-8ff7-17f498f7bc77" path="/var/lib/kubelet/pods/30aae997-54b9-46f6-8ff7-17f498f7bc77/volumes" Apr 23 17:06:12.903751 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:12.903702 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7" podUID="4da01992-da60-469a-8bb9-bd54a37f9e30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 17:06:21.211254 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:21.211222 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh"] Apr 23 17:06:21.211617 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:21.211458 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" podUID="13a91d7b-802c-46b5-855c-48425a6dc410" containerName="sequence-graph-a7ecd" containerID="cri-o://792a204803745e76ff636bb8b64ab23d3c6b9941cc1f97de621241efae0fbd68" gracePeriod=30 Apr 23 17:06:21.331134 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:21.331103 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds"] Apr 23 17:06:21.331399 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:21.331375 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds" podUID="d948d245-4a2d-49b9-a873-32c041bc5cc0" containerName="kserve-container" containerID="cri-o://5aa2de7e654081432476c634ae9260711502120576ce13ad3dadff31d1d63991" gracePeriod=30 Apr 23 17:06:21.344094 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:21.344069 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk"] Apr 23 17:06:21.344314 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:21.344303 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30aae997-54b9-46f6-8ff7-17f498f7bc77" containerName="ensemble-graph-3ffea" Apr 23 17:06:21.344355 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:21.344316 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="30aae997-54b9-46f6-8ff7-17f498f7bc77" containerName="ensemble-graph-3ffea" Apr 23 17:06:21.344355 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:21.344336 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9350d16-cbd5-49a9-a075-8fa3a8cfed3d" containerName="kserve-container" Apr 23 17:06:21.344355 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:21.344341 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9350d16-cbd5-49a9-a075-8fa3a8cfed3d" containerName="kserve-container" Apr 23 17:06:21.344448 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:21.344384 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="30aae997-54b9-46f6-8ff7-17f498f7bc77" containerName="ensemble-graph-3ffea" Apr 23 17:06:21.344448 ip-10-0-141-189 kubenswrapper[2561]: 
I0423 17:06:21.344395 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9350d16-cbd5-49a9-a075-8fa3a8cfed3d" containerName="kserve-container" Apr 23 17:06:21.348407 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:21.348391 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk" Apr 23 17:06:21.354452 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:21.354421 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk"] Apr 23 17:06:21.358428 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:21.358411 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk" Apr 23 17:06:21.486252 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:21.486180 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk"] Apr 23 17:06:21.489311 ip-10-0-141-189 kubenswrapper[2561]: W0423 17:06:21.489277 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode30c495e_2209_4513_951e_8ca949dd4097.slice/crio-e90122ae84256aa462a0407c1fd3cebf22ff7758f062b064184002ff173f86a9 WatchSource:0}: Error finding container e90122ae84256aa462a0407c1fd3cebf22ff7758f062b064184002ff173f86a9: Status 404 returned error can't find the container with id e90122ae84256aa462a0407c1fd3cebf22ff7758f062b064184002ff173f86a9 Apr 23 17:06:22.012916 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:22.012886 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk" event={"ID":"e30c495e-2209-4513-951e-8ca949dd4097","Type":"ContainerStarted","Data":"acd31e149fec9c18eb68fa46fdc11f168d3bf4e7456126fb537b2326479e8bca"} Apr 23 17:06:22.012916 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:22.012919 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk" event={"ID":"e30c495e-2209-4513-951e-8ca949dd4097","Type":"ContainerStarted","Data":"e90122ae84256aa462a0407c1fd3cebf22ff7758f062b064184002ff173f86a9"} Apr 23 17:06:22.013158 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:22.013010 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk" Apr 23 17:06:22.014288 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:22.014261 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk" podUID="e30c495e-2209-4513-951e-8ca949dd4097" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 17:06:22.034130 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:22.034087 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk" podStartSLOduration=1.034072874 podStartE2EDuration="1.034072874s" podCreationTimestamp="2026-04-23 17:06:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:06:22.033475264 +0000 UTC m=+1876.586909433" watchObservedRunningTime="2026-04-23 17:06:22.034072874 +0000 UTC 
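The "Observed pod startup duration" line just above carries the figures usually wanted when triaging this run: podStartE2EDuration is the creation-to-running latency, and the m=+... suffixes are monotonic offsets since kubelet start. A small illustrative Go helper (not kubelet code) for pulling that field out of such lines:

```go
package main

import (
	"fmt"
	"regexp"
	"time"
)

// Matches the quoted duration in podStartE2EDuration="1.034072874s".
var e2eRe = regexp.MustCompile(`podStartE2EDuration="([^"]+)"`)

func main() {
	line := `pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk" podStartE2EDuration="1.034072874s"`
	m := e2eRe.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("no podStartE2EDuration field in line")
		return
	}
	d, err := time.ParseDuration(m[1]) // kubelet prints Go duration syntax
	if err != nil {
		fmt.Println("unparsable duration:", err)
		return
	}
	fmt.Printf("pod became running %v after creation\n", d) // ~1.034s here
}
```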
m=+1876.587507044" Apr 23 17:06:22.903401 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:22.903361 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7" podUID="4da01992-da60-469a-8bb9-bd54a37f9e30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 17:06:23.016059 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:23.016022 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk" podUID="e30c495e-2209-4513-951e-8ca949dd4097" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 17:06:23.749481 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:23.749440 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" podUID="13a91d7b-802c-46b5-855c-48425a6dc410" containerName="sequence-graph-a7ecd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:06:24.172444 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:24.172424 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds" Apr 23 17:06:25.021791 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:25.021708 2561 generic.go:358] "Generic (PLEG): container finished" podID="d948d245-4a2d-49b9-a873-32c041bc5cc0" containerID="5aa2de7e654081432476c634ae9260711502120576ce13ad3dadff31d1d63991" exitCode=0 Apr 23 17:06:25.021791 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:25.021751 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds" event={"ID":"d948d245-4a2d-49b9-a873-32c041bc5cc0","Type":"ContainerDied","Data":"5aa2de7e654081432476c634ae9260711502120576ce13ad3dadff31d1d63991"} Apr 23 17:06:25.021791 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:25.021769 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds" Apr 23 17:06:25.021791 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:25.021782 2561 scope.go:117] "RemoveContainer" containerID="5aa2de7e654081432476c634ae9260711502120576ce13ad3dadff31d1d63991" Apr 23 17:06:25.022083 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:25.021772 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds" event={"ID":"d948d245-4a2d-49b9-a873-32c041bc5cc0","Type":"ContainerDied","Data":"50e5d9cf1688015cb6fad899cda56d223cb30ddf46cfc96a82be5da62289aefb"} Apr 23 17:06:25.029463 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:25.029439 2561 scope.go:117] "RemoveContainer" containerID="5aa2de7e654081432476c634ae9260711502120576ce13ad3dadff31d1d63991" Apr 23 17:06:25.029716 ip-10-0-141-189 kubenswrapper[2561]: E0423 17:06:25.029699 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aa2de7e654081432476c634ae9260711502120576ce13ad3dadff31d1d63991\": container with ID starting with 5aa2de7e654081432476c634ae9260711502120576ce13ad3dadff31d1d63991 not found: ID does not exist" containerID="5aa2de7e654081432476c634ae9260711502120576ce13ad3dadff31d1d63991" Apr 23 17:06:25.029779 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:25.029722 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aa2de7e654081432476c634ae9260711502120576ce13ad3dadff31d1d63991"} err="failed to get container status \"5aa2de7e654081432476c634ae9260711502120576ce13ad3dadff31d1d63991\": rpc error: code = NotFound desc = could not find container \"5aa2de7e654081432476c634ae9260711502120576ce13ad3dadff31d1d63991\": container with ID starting with 5aa2de7e654081432476c634ae9260711502120576ce13ad3dadff31d1d63991 not found: ID does not exist" Apr 23 17:06:25.040663 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:25.040642 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds"] Apr 23 17:06:25.044380 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:25.044360 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a7ecd-predictor-76b497db67-9wwds"] Apr 23 17:06:26.030722 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:26.030681 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d948d245-4a2d-49b9-a873-32c041bc5cc0" path="/var/lib/kubelet/pods/d948d245-4a2d-49b9-a873-32c041bc5cc0/volumes" Apr 23 17:06:28.749688 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:28.749643 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" podUID="13a91d7b-802c-46b5-855c-48425a6dc410" containerName="sequence-graph-a7ecd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:06:32.904273 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:32.904243 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7" Apr 23 17:06:33.016985 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:33.016950 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk" podUID="e30c495e-2209-4513-951e-8ca949dd4097" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 17:06:33.749671 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:33.749633 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" podUID="13a91d7b-802c-46b5-855c-48425a6dc410" containerName="sequence-graph-a7ecd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:06:33.749864 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:33.749729 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" Apr 23 17:06:38.749694 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:38.749647 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" podUID="13a91d7b-802c-46b5-855c-48425a6dc410" containerName="sequence-graph-a7ecd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:06:43.016713 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:43.016671 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk" podUID="e30c495e-2209-4513-951e-8ca949dd4097" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 17:06:43.749571 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:43.749535 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" podUID="13a91d7b-802c-46b5-855c-48425a6dc410" containerName="sequence-graph-a7ecd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:06:48.748733 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:48.748690 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" podUID="13a91d7b-802c-46b5-855c-48425a6dc410" containerName="sequence-graph-a7ecd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:06:51.350619 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.350598 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" Apr 23 17:06:51.355058 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.355040 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13a91d7b-802c-46b5-855c-48425a6dc410-openshift-service-ca-bundle\") pod \"13a91d7b-802c-46b5-855c-48425a6dc410\" (UID: \"13a91d7b-802c-46b5-855c-48425a6dc410\") " Apr 23 17:06:51.355149 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.355070 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13a91d7b-802c-46b5-855c-48425a6dc410-proxy-tls\") pod \"13a91d7b-802c-46b5-855c-48425a6dc410\" (UID: \"13a91d7b-802c-46b5-855c-48425a6dc410\") " Apr 23 17:06:51.355365 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.355345 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a91d7b-802c-46b5-855c-48425a6dc410-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "13a91d7b-802c-46b5-855c-48425a6dc410" (UID: "13a91d7b-802c-46b5-855c-48425a6dc410"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:06:51.356942 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.356920 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a91d7b-802c-46b5-855c-48425a6dc410-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "13a91d7b-802c-46b5-855c-48425a6dc410" (UID: "13a91d7b-802c-46b5-855c-48425a6dc410"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:06:51.455947 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.455927 2561 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13a91d7b-802c-46b5-855c-48425a6dc410-openshift-service-ca-bundle\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 17:06:51.455947 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.455947 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13a91d7b-802c-46b5-855c-48425a6dc410-proxy-tls\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 17:06:51.680624 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.680598 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt"] Apr 23 17:06:51.680848 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.680837 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13a91d7b-802c-46b5-855c-48425a6dc410" containerName="sequence-graph-a7ecd" Apr 23 17:06:51.680906 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.680849 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a91d7b-802c-46b5-855c-48425a6dc410" containerName="sequence-graph-a7ecd" Apr 23 17:06:51.680906 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.680859 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d948d245-4a2d-49b9-a873-32c041bc5cc0" containerName="kserve-container" Apr 23 17:06:51.680906 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.680866 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="d948d245-4a2d-49b9-a873-32c041bc5cc0" containerName="kserve-container" Apr 23 17:06:51.681082 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.680925 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="d948d245-4a2d-49b9-a873-32c041bc5cc0" containerName="kserve-container" Apr 23 17:06:51.681082 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.680936 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="13a91d7b-802c-46b5-855c-48425a6dc410" containerName="sequence-graph-a7ecd" Apr 23 17:06:51.683971 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.683953 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" Apr 23 17:06:51.686502 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.686479 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-56812-kube-rbac-proxy-sar-config\"" Apr 23 17:06:51.686614 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.686499 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-56812-serving-cert\"" Apr 23 17:06:51.691796 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.691776 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt"] Apr 23 17:06:51.758247 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.758221 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1041e52a-41ef-4382-8640-baf093084d81-openshift-service-ca-bundle\") pod \"splitter-graph-56812-687c6bbdcf-mjdzt\" (UID: \"1041e52a-41ef-4382-8640-baf093084d81\") " pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" Apr 23 17:06:51.758361 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.758256 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1041e52a-41ef-4382-8640-baf093084d81-proxy-tls\") pod \"splitter-graph-56812-687c6bbdcf-mjdzt\" (UID: \"1041e52a-41ef-4382-8640-baf093084d81\") " pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" Apr 23 17:06:51.859139 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.859112 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1041e52a-41ef-4382-8640-baf093084d81-openshift-service-ca-bundle\") pod \"splitter-graph-56812-687c6bbdcf-mjdzt\" (UID: \"1041e52a-41ef-4382-8640-baf093084d81\") " pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" Apr 23 17:06:51.859291 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.859149 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1041e52a-41ef-4382-8640-baf093084d81-proxy-tls\") pod \"splitter-graph-56812-687c6bbdcf-mjdzt\" (UID: \"1041e52a-41ef-4382-8640-baf093084d81\") " pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" Apr 23 17:06:51.859291 ip-10-0-141-189 kubenswrapper[2561]: E0423 17:06:51.859264 2561 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-56812-serving-cert: secret "splitter-graph-56812-serving-cert" not found Apr 23 17:06:51.859372 ip-10-0-141-189 kubenswrapper[2561]: E0423 17:06:51.859330 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1041e52a-41ef-4382-8640-baf093084d81-proxy-tls podName:1041e52a-41ef-4382-8640-baf093084d81 nodeName:}" failed. No retries permitted until 2026-04-23 17:06:52.359313625 +0000 UTC m=+1906.912747772 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1041e52a-41ef-4382-8640-baf093084d81-proxy-tls") pod "splitter-graph-56812-687c6bbdcf-mjdzt" (UID: "1041e52a-41ef-4382-8640-baf093084d81") : secret "splitter-graph-56812-serving-cert" not found Apr 23 17:06:51.859716 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:51.859698 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1041e52a-41ef-4382-8640-baf093084d81-openshift-service-ca-bundle\") pod \"splitter-graph-56812-687c6bbdcf-mjdzt\" (UID: \"1041e52a-41ef-4382-8640-baf093084d81\") " pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" Apr 23 17:06:52.089893 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:52.089844 2561 generic.go:358] "Generic (PLEG): container finished" podID="13a91d7b-802c-46b5-855c-48425a6dc410" containerID="792a204803745e76ff636bb8b64ab23d3c6b9941cc1f97de621241efae0fbd68" exitCode=137 Apr 23 17:06:52.090033 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:52.089912 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" event={"ID":"13a91d7b-802c-46b5-855c-48425a6dc410","Type":"ContainerDied","Data":"792a204803745e76ff636bb8b64ab23d3c6b9941cc1f97de621241efae0fbd68"} Apr 23 17:06:52.090033 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:52.089963 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" event={"ID":"13a91d7b-802c-46b5-855c-48425a6dc410","Type":"ContainerDied","Data":"60751a7a8e3f4acbd53136e6551ebfb3928070fafea8e0200a69f2d2f0ce9603"} Apr 23 17:06:52.090033 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:52.089979 2561 scope.go:117] "RemoveContainer" containerID="792a204803745e76ff636bb8b64ab23d3c6b9941cc1f97de621241efae0fbd68" Apr 23 17:06:52.090033 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:52.089933 2561 util.go:48] "No ready sandbox for pod can be found. 
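Two things worth noting in the span above. First, exitCode=137 for sequence-graph-a7ecd is 128 + 9 (SIGKILL): the container did not exit within the 30s grace period requested earlier, so the runtime killed it. Second, the "No retries permitted until ... (durationBeforeRetry 500ms)" error shows the volume manager backing off when SetUp fails because the serving-cert secret does not exist yet; the mount succeeds on a later retry once the secret appears. A hedged Go sketch of that capped-exponential-backoff pattern, with illustrative constants rather than the kubelet's exact ones:

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// mountWithBackoff retries setUp with a doubling, capped delay.
func mountWithBackoff(setUp func() error, initial, max time.Duration) error {
	delay := initial
	for i := 0; i < 8; i++ {
		err := setUp()
		if err == nil {
			return nil
		}
		fmt.Printf("SetUp failed (%v); no retries permitted for %v\n", err, delay)
		time.Sleep(delay)
		if delay *= 2; delay > max {
			delay = max
		}
	}
	return errors.New("giving up on volume SetUp")
}

func main() {
	calls := 0
	err := mountWithBackoff(func() error {
		if calls++; calls < 3 {
			return errors.New(`secret "splitter-graph-56812-serving-cert" not found`)
		}
		return nil // the secret eventually appears, as in the log above
	}, 500*time.Millisecond, 10*time.Second)
	fmt.Println("result:", err)
}
```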
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh" Apr 23 17:06:52.097302 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:52.097284 2561 scope.go:117] "RemoveContainer" containerID="792a204803745e76ff636bb8b64ab23d3c6b9941cc1f97de621241efae0fbd68" Apr 23 17:06:52.097549 ip-10-0-141-189 kubenswrapper[2561]: E0423 17:06:52.097530 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"792a204803745e76ff636bb8b64ab23d3c6b9941cc1f97de621241efae0fbd68\": container with ID starting with 792a204803745e76ff636bb8b64ab23d3c6b9941cc1f97de621241efae0fbd68 not found: ID does not exist" containerID="792a204803745e76ff636bb8b64ab23d3c6b9941cc1f97de621241efae0fbd68" Apr 23 17:06:52.097607 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:52.097558 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"792a204803745e76ff636bb8b64ab23d3c6b9941cc1f97de621241efae0fbd68"} err="failed to get container status \"792a204803745e76ff636bb8b64ab23d3c6b9941cc1f97de621241efae0fbd68\": rpc error: code = NotFound desc = could not find container \"792a204803745e76ff636bb8b64ab23d3c6b9941cc1f97de621241efae0fbd68\": container with ID starting with 792a204803745e76ff636bb8b64ab23d3c6b9941cc1f97de621241efae0fbd68 not found: ID does not exist" Apr 23 17:06:52.106024 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:52.106002 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh"] Apr 23 17:06:52.107889 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:52.107854 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-a7ecd-78b4fbf76f-gbcrh"] Apr 23 17:06:52.362932 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:52.362840 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1041e52a-41ef-4382-8640-baf093084d81-proxy-tls\") pod \"splitter-graph-56812-687c6bbdcf-mjdzt\" (UID: \"1041e52a-41ef-4382-8640-baf093084d81\") " pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" Apr 23 17:06:52.365064 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:52.365046 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1041e52a-41ef-4382-8640-baf093084d81-proxy-tls\") pod \"splitter-graph-56812-687c6bbdcf-mjdzt\" (UID: \"1041e52a-41ef-4382-8640-baf093084d81\") " pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" Apr 23 17:06:52.594463 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:52.594431 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" Apr 23 17:06:52.708821 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:52.708792 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt"] Apr 23 17:06:52.712288 ip-10-0-141-189 kubenswrapper[2561]: W0423 17:06:52.712262 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1041e52a_41ef_4382_8640_baf093084d81.slice/crio-d9881e43465053bb54fcd56be14eab517e4e0d3e673bc76dd8b150f15246ce48 WatchSource:0}: Error finding container d9881e43465053bb54fcd56be14eab517e4e0d3e673bc76dd8b150f15246ce48: Status 404 returned error can't find the container with id d9881e43465053bb54fcd56be14eab517e4e0d3e673bc76dd8b150f15246ce48 Apr 23 17:06:53.016867 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:53.016785 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk" podUID="e30c495e-2209-4513-951e-8ca949dd4097" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 17:06:53.093995 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:53.093956 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" event={"ID":"1041e52a-41ef-4382-8640-baf093084d81","Type":"ContainerStarted","Data":"b4d668d179fbffec2c826c1525d0ddd12a9f1be0b1b193f13733b8375d8a9ac4"} Apr 23 17:06:53.093995 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:53.093994 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" event={"ID":"1041e52a-41ef-4382-8640-baf093084d81","Type":"ContainerStarted","Data":"d9881e43465053bb54fcd56be14eab517e4e0d3e673bc76dd8b150f15246ce48"} Apr 23 17:06:53.094208 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:53.094164 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" Apr 23 17:06:53.111890 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:53.111832 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" podStartSLOduration=2.111819396 podStartE2EDuration="2.111819396s" podCreationTimestamp="2026-04-23 17:06:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:06:53.11077804 +0000 UTC m=+1907.664212223" watchObservedRunningTime="2026-04-23 17:06:53.111819396 +0000 UTC m=+1907.665253565" Apr 23 17:06:54.031586 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:54.031550 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a91d7b-802c-46b5-855c-48425a6dc410" path="/var/lib/kubelet/pods/13a91d7b-802c-46b5-855c-48425a6dc410/volumes" Apr 23 17:06:59.103024 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:06:59.102997 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" Apr 23 17:07:01.753959 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:01.753921 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt"] Apr 23 17:07:01.754343 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:01.754151 2561 
Apr 23 17:07:01.890514 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:01.890482 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc"]
Apr 23 17:07:01.893955 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:01.893934 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc"
Apr 23 17:07:01.899198 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:01.899171 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7"]
Apr 23 17:07:01.899411 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:01.899386 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7" podUID="4da01992-da60-469a-8bb9-bd54a37f9e30" containerName="kserve-container" containerID="cri-o://7d73ccb525bf062b11f1a6a5e2f2170231002e020aec4af46beaac6266251033" gracePeriod=30
Apr 23 17:07:01.902724 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:01.902702 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc"]
Apr 23 17:07:01.904809 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:01.904791 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc"
Apr 23 17:07:02.030663 ip-10-0-141-189 kubenswrapper[2561]: W0423 17:07:02.030632 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6780ca19_3b6b_4ac5_98c0_2d25888afd97.slice/crio-caa06229cce562dc42162ba0a3338dcade916de5535224e2c9cb6ddefde44786 WatchSource:0}: Error finding container caa06229cce562dc42162ba0a3338dcade916de5535224e2c9cb6ddefde44786: Status 404 returned error can't find the container with id caa06229cce562dc42162ba0a3338dcade916de5535224e2c9cb6ddefde44786
Apr 23 17:07:02.031404 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:02.031378 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc"]
Apr 23 17:07:02.116364 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:02.116332 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc" event={"ID":"6780ca19-3b6b-4ac5-98c0-2d25888afd97","Type":"ContainerStarted","Data":"57bb7d3d2d47fb8e68594aaefa6b453d4a348b1858e501c38fadb533836c124c"}
Apr 23 17:07:02.116457 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:02.116371 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc" event={"ID":"6780ca19-3b6b-4ac5-98c0-2d25888afd97","Type":"ContainerStarted","Data":"caa06229cce562dc42162ba0a3338dcade916de5535224e2c9cb6ddefde44786"}
Apr 23 17:07:02.116557 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:02.116540 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc"
Apr 23 17:07:02.117972 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:02.117949 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc" podUID="6780ca19-3b6b-4ac5-98c0-2d25888afd97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 23 17:07:02.131583 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:02.131543 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc" podStartSLOduration=1.131533046 podStartE2EDuration="1.131533046s" podCreationTimestamp="2026-04-23 17:07:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:07:02.130377209 +0000 UTC m=+1916.683811377" watchObservedRunningTime="2026-04-23 17:07:02.131533046 +0000 UTC m=+1916.684967242"
Apr 23 17:07:02.903483 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:02.903446 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7" podUID="4da01992-da60-469a-8bb9-bd54a37f9e30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 23 17:07:03.016565 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:03.016522 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk" podUID="e30c495e-2209-4513-951e-8ca949dd4097" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 23 17:07:03.121370 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:03.121335 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc" podUID="6780ca19-3b6b-4ac5-98c0-2d25888afd97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 23 17:07:04.101253 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:04.101216 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" podUID="1041e52a-41ef-4382-8640-baf093084d81" containerName="splitter-graph-56812" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:07:04.741040 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:04.741017 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7"
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7" Apr 23 17:07:05.128036 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:05.128007 2561 generic.go:358] "Generic (PLEG): container finished" podID="4da01992-da60-469a-8bb9-bd54a37f9e30" containerID="7d73ccb525bf062b11f1a6a5e2f2170231002e020aec4af46beaac6266251033" exitCode=0 Apr 23 17:07:05.128438 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:05.128042 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7" event={"ID":"4da01992-da60-469a-8bb9-bd54a37f9e30","Type":"ContainerDied","Data":"7d73ccb525bf062b11f1a6a5e2f2170231002e020aec4af46beaac6266251033"} Apr 23 17:07:05.128438 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:05.128064 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7" event={"ID":"4da01992-da60-469a-8bb9-bd54a37f9e30","Type":"ContainerDied","Data":"2828f392f274306ef845a4ce69315bdde5d627ac9ef5de55abbdc58ff024a450"} Apr 23 17:07:05.128438 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:05.128066 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7" Apr 23 17:07:05.128438 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:05.128079 2561 scope.go:117] "RemoveContainer" containerID="7d73ccb525bf062b11f1a6a5e2f2170231002e020aec4af46beaac6266251033" Apr 23 17:07:05.135704 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:05.135677 2561 scope.go:117] "RemoveContainer" containerID="7d73ccb525bf062b11f1a6a5e2f2170231002e020aec4af46beaac6266251033" Apr 23 17:07:05.136154 ip-10-0-141-189 kubenswrapper[2561]: E0423 17:07:05.136133 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d73ccb525bf062b11f1a6a5e2f2170231002e020aec4af46beaac6266251033\": container with ID starting with 7d73ccb525bf062b11f1a6a5e2f2170231002e020aec4af46beaac6266251033 not found: ID does not exist" containerID="7d73ccb525bf062b11f1a6a5e2f2170231002e020aec4af46beaac6266251033" Apr 23 17:07:05.136220 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:05.136163 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d73ccb525bf062b11f1a6a5e2f2170231002e020aec4af46beaac6266251033"} err="failed to get container status \"7d73ccb525bf062b11f1a6a5e2f2170231002e020aec4af46beaac6266251033\": rpc error: code = NotFound desc = could not find container \"7d73ccb525bf062b11f1a6a5e2f2170231002e020aec4af46beaac6266251033\": container with ID starting with 7d73ccb525bf062b11f1a6a5e2f2170231002e020aec4af46beaac6266251033 not found: ID does not exist" Apr 23 17:07:05.147533 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:05.147491 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7"] Apr 23 17:07:05.148746 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:05.148723 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-56812-predictor-5dc6bf65f5-tpwn7"] Apr 23 17:07:06.031219 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:06.031190 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4da01992-da60-469a-8bb9-bd54a37f9e30" path="/var/lib/kubelet/pods/4da01992-da60-469a-8bb9-bd54a37f9e30/volumes" Apr 23 17:07:09.101705 
ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:09.101667 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" podUID="1041e52a-41ef-4382-8640-baf093084d81" containerName="splitter-graph-56812" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:07:13.017049 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:13.017018 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk" Apr 23 17:07:13.122284 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:13.122247 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc" podUID="6780ca19-3b6b-4ac5-98c0-2d25888afd97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 23 17:07:14.101158 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:14.101106 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" podUID="1041e52a-41ef-4382-8640-baf093084d81" containerName="splitter-graph-56812" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:07:14.101527 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:14.101225 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" Apr 23 17:07:19.101367 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:19.101324 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" podUID="1041e52a-41ef-4382-8640-baf093084d81" containerName="splitter-graph-56812" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:07:23.122129 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:23.122091 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc" podUID="6780ca19-3b6b-4ac5-98c0-2d25888afd97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 23 17:07:24.101868 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:24.101823 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" podUID="1041e52a-41ef-4382-8640-baf093084d81" containerName="splitter-graph-56812" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:07:29.101583 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:29.101548 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" podUID="1041e52a-41ef-4382-8640-baf093084d81" containerName="splitter-graph-56812" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:07:31.427517 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:31.427483 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46"] Apr 23 17:07:31.428004 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:31.427733 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4da01992-da60-469a-8bb9-bd54a37f9e30" containerName="kserve-container" Apr 23 17:07:31.428004 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:31.427743 2561 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4da01992-da60-469a-8bb9-bd54a37f9e30" containerName="kserve-container" Apr 23 17:07:31.428004 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:31.427800 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="4da01992-da60-469a-8bb9-bd54a37f9e30" containerName="kserve-container" Apr 23 17:07:31.430608 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:31.430591 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" Apr 23 17:07:31.433103 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:31.433080 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-390fe-serving-cert\"" Apr 23 17:07:31.433103 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:31.433092 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-390fe-kube-rbac-proxy-sar-config\"" Apr 23 17:07:31.439054 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:31.439035 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46"] Apr 23 17:07:31.524321 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:31.524289 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/598bcef5-3c35-41c3-8ea3-8d94f8cb2782-openshift-service-ca-bundle\") pod \"switch-graph-390fe-857467bf76-rgb46\" (UID: \"598bcef5-3c35-41c3-8ea3-8d94f8cb2782\") " pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" Apr 23 17:07:31.524482 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:31.524345 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/598bcef5-3c35-41c3-8ea3-8d94f8cb2782-proxy-tls\") pod \"switch-graph-390fe-857467bf76-rgb46\" (UID: \"598bcef5-3c35-41c3-8ea3-8d94f8cb2782\") " pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" Apr 23 17:07:31.625064 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:31.625032 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/598bcef5-3c35-41c3-8ea3-8d94f8cb2782-openshift-service-ca-bundle\") pod \"switch-graph-390fe-857467bf76-rgb46\" (UID: \"598bcef5-3c35-41c3-8ea3-8d94f8cb2782\") " pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" Apr 23 17:07:31.625222 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:31.625080 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/598bcef5-3c35-41c3-8ea3-8d94f8cb2782-proxy-tls\") pod \"switch-graph-390fe-857467bf76-rgb46\" (UID: \"598bcef5-3c35-41c3-8ea3-8d94f8cb2782\") " pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" Apr 23 17:07:31.625638 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:31.625621 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/598bcef5-3c35-41c3-8ea3-8d94f8cb2782-openshift-service-ca-bundle\") pod \"switch-graph-390fe-857467bf76-rgb46\" (UID: \"598bcef5-3c35-41c3-8ea3-8d94f8cb2782\") " pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" Apr 23 17:07:31.627259 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:31.627242 2561 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/598bcef5-3c35-41c3-8ea3-8d94f8cb2782-proxy-tls\") pod \"switch-graph-390fe-857467bf76-rgb46\" (UID: \"598bcef5-3c35-41c3-8ea3-8d94f8cb2782\") " pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" Apr 23 17:07:31.741020 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:31.740942 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" Apr 23 17:07:31.813109 ip-10-0-141-189 kubenswrapper[2561]: E0423 17:07:31.812409 2561 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1041e52a_41ef_4382_8640_baf093084d81.slice/crio-conmon-b4d668d179fbffec2c826c1525d0ddd12a9f1be0b1b193f13733b8375d8a9ac4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1041e52a_41ef_4382_8640_baf093084d81.slice/crio-b4d668d179fbffec2c826c1525d0ddd12a9f1be0b1b193f13733b8375d8a9ac4.scope\": RecentStats: unable to find data in memory cache]" Apr 23 17:07:31.814304 ip-10-0-141-189 kubenswrapper[2561]: E0423 17:07:31.812519 2561 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1041e52a_41ef_4382_8640_baf093084d81.slice/crio-d9881e43465053bb54fcd56be14eab517e4e0d3e673bc76dd8b150f15246ce48\": RecentStats: unable to find data in memory cache]" Apr 23 17:07:31.814462 ip-10-0-141-189 kubenswrapper[2561]: E0423 17:07:31.812619 2561 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1041e52a_41ef_4382_8640_baf093084d81.slice/crio-d9881e43465053bb54fcd56be14eab517e4e0d3e673bc76dd8b150f15246ce48\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1041e52a_41ef_4382_8640_baf093084d81.slice/crio-conmon-b4d668d179fbffec2c826c1525d0ddd12a9f1be0b1b193f13733b8375d8a9ac4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1041e52a_41ef_4382_8640_baf093084d81.slice/crio-b4d668d179fbffec2c826c1525d0ddd12a9f1be0b1b193f13733b8375d8a9ac4.scope\": RecentStats: unable to find data in memory cache]" Apr 23 17:07:31.868045 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:31.868020 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46"] Apr 23 17:07:31.870142 ip-10-0-141-189 kubenswrapper[2561]: W0423 17:07:31.870110 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod598bcef5_3c35_41c3_8ea3_8d94f8cb2782.slice/crio-9c9941c438779b888d88a41e125cea6ef05289470962711908c40bbf8957f9d2 WatchSource:0}: Error finding container 9c9941c438779b888d88a41e125cea6ef05289470962711908c40bbf8957f9d2: Status 404 returned error can't find the container with id 9c9941c438779b888d88a41e125cea6ef05289470962711908c40bbf8957f9d2 Apr 23 17:07:31.926713 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:31.926691 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" Apr 23 17:07:32.028358 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:32.028335 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1041e52a-41ef-4382-8640-baf093084d81-proxy-tls\") pod \"1041e52a-41ef-4382-8640-baf093084d81\" (UID: \"1041e52a-41ef-4382-8640-baf093084d81\") " Apr 23 17:07:32.028496 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:32.028388 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1041e52a-41ef-4382-8640-baf093084d81-openshift-service-ca-bundle\") pod \"1041e52a-41ef-4382-8640-baf093084d81\" (UID: \"1041e52a-41ef-4382-8640-baf093084d81\") " Apr 23 17:07:32.028752 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:32.028719 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1041e52a-41ef-4382-8640-baf093084d81-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "1041e52a-41ef-4382-8640-baf093084d81" (UID: "1041e52a-41ef-4382-8640-baf093084d81"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:07:32.030345 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:32.030317 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1041e52a-41ef-4382-8640-baf093084d81-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1041e52a-41ef-4382-8640-baf093084d81" (UID: "1041e52a-41ef-4382-8640-baf093084d81"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:07:32.128985 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:32.128955 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1041e52a-41ef-4382-8640-baf093084d81-proxy-tls\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 17:07:32.128985 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:32.128977 2561 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1041e52a-41ef-4382-8640-baf093084d81-openshift-service-ca-bundle\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 17:07:32.199833 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:32.199803 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" event={"ID":"598bcef5-3c35-41c3-8ea3-8d94f8cb2782","Type":"ContainerStarted","Data":"e9671fd5f89a4287783e36b7388fec721dd0d2b5df9f1789ba645e2152ce80bc"} Apr 23 17:07:32.200005 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:32.199842 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" event={"ID":"598bcef5-3c35-41c3-8ea3-8d94f8cb2782","Type":"ContainerStarted","Data":"9c9941c438779b888d88a41e125cea6ef05289470962711908c40bbf8957f9d2"} Apr 23 17:07:32.200005 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:32.199920 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" Apr 23 17:07:32.200748 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:32.200721 2561 generic.go:358] "Generic (PLEG): container finished" podID="1041e52a-41ef-4382-8640-baf093084d81" 
containerID="b4d668d179fbffec2c826c1525d0ddd12a9f1be0b1b193f13733b8375d8a9ac4" exitCode=0 Apr 23 17:07:32.200826 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:32.200771 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" Apr 23 17:07:32.200826 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:32.200797 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" event={"ID":"1041e52a-41ef-4382-8640-baf093084d81","Type":"ContainerDied","Data":"b4d668d179fbffec2c826c1525d0ddd12a9f1be0b1b193f13733b8375d8a9ac4"} Apr 23 17:07:32.200826 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:32.200822 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt" event={"ID":"1041e52a-41ef-4382-8640-baf093084d81","Type":"ContainerDied","Data":"d9881e43465053bb54fcd56be14eab517e4e0d3e673bc76dd8b150f15246ce48"} Apr 23 17:07:32.200953 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:32.200838 2561 scope.go:117] "RemoveContainer" containerID="b4d668d179fbffec2c826c1525d0ddd12a9f1be0b1b193f13733b8375d8a9ac4" Apr 23 17:07:32.208331 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:32.208316 2561 scope.go:117] "RemoveContainer" containerID="b4d668d179fbffec2c826c1525d0ddd12a9f1be0b1b193f13733b8375d8a9ac4" Apr 23 17:07:32.208582 ip-10-0-141-189 kubenswrapper[2561]: E0423 17:07:32.208564 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4d668d179fbffec2c826c1525d0ddd12a9f1be0b1b193f13733b8375d8a9ac4\": container with ID starting with b4d668d179fbffec2c826c1525d0ddd12a9f1be0b1b193f13733b8375d8a9ac4 not found: ID does not exist" containerID="b4d668d179fbffec2c826c1525d0ddd12a9f1be0b1b193f13733b8375d8a9ac4" Apr 23 17:07:32.208675 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:32.208588 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4d668d179fbffec2c826c1525d0ddd12a9f1be0b1b193f13733b8375d8a9ac4"} err="failed to get container status \"b4d668d179fbffec2c826c1525d0ddd12a9f1be0b1b193f13733b8375d8a9ac4\": rpc error: code = NotFound desc = could not find container \"b4d668d179fbffec2c826c1525d0ddd12a9f1be0b1b193f13733b8375d8a9ac4\": container with ID starting with b4d668d179fbffec2c826c1525d0ddd12a9f1be0b1b193f13733b8375d8a9ac4 not found: ID does not exist" Apr 23 17:07:32.215656 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:32.215622 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" podStartSLOduration=1.2156120910000001 podStartE2EDuration="1.215612091s" podCreationTimestamp="2026-04-23 17:07:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:07:32.214228933 +0000 UTC m=+1946.767663102" watchObservedRunningTime="2026-04-23 17:07:32.215612091 +0000 UTC m=+1946.769046259" Apr 23 17:07:32.225498 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:32.225474 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt"] Apr 23 17:07:32.229061 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:32.229038 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-56812-687c6bbdcf-mjdzt"] Apr 23 17:07:33.121390 
ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:33.121355 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc" podUID="6780ca19-3b6b-4ac5-98c0-2d25888afd97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 23 17:07:34.031696 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:34.031665 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1041e52a-41ef-4382-8640-baf093084d81" path="/var/lib/kubelet/pods/1041e52a-41ef-4382-8640-baf093084d81/volumes" Apr 23 17:07:38.211704 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:38.211675 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" Apr 23 17:07:43.122014 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:43.121976 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc" podUID="6780ca19-3b6b-4ac5-98c0-2d25888afd97" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 23 17:07:53.122168 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:07:53.122136 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc" Apr 23 17:08:11.962968 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:11.962932 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m"] Apr 23 17:08:11.963348 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:11.963178 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1041e52a-41ef-4382-8640-baf093084d81" containerName="splitter-graph-56812" Apr 23 17:08:11.963348 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:11.963188 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="1041e52a-41ef-4382-8640-baf093084d81" containerName="splitter-graph-56812" Apr 23 17:08:11.963348 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:11.963238 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="1041e52a-41ef-4382-8640-baf093084d81" containerName="splitter-graph-56812" Apr 23 17:08:11.967235 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:11.967215 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" Apr 23 17:08:11.969838 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:11.969821 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-5bc06-serving-cert\"" Apr 23 17:08:11.969932 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:11.969824 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-5bc06-kube-rbac-proxy-sar-config\"" Apr 23 17:08:11.975521 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:11.975500 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m"] Apr 23 17:08:12.084968 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:12.084933 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03838bdb-892d-428b-b289-8fff306cda58-openshift-service-ca-bundle\") pod \"splitter-graph-5bc06-756cc58f5d-jkc9m\" (UID: \"03838bdb-892d-428b-b289-8fff306cda58\") " pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" Apr 23 17:08:12.085142 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:12.084983 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/03838bdb-892d-428b-b289-8fff306cda58-proxy-tls\") pod \"splitter-graph-5bc06-756cc58f5d-jkc9m\" (UID: \"03838bdb-892d-428b-b289-8fff306cda58\") " pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" Apr 23 17:08:12.186321 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:12.186282 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03838bdb-892d-428b-b289-8fff306cda58-openshift-service-ca-bundle\") pod \"splitter-graph-5bc06-756cc58f5d-jkc9m\" (UID: \"03838bdb-892d-428b-b289-8fff306cda58\") " pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" Apr 23 17:08:12.186488 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:12.186339 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/03838bdb-892d-428b-b289-8fff306cda58-proxy-tls\") pod \"splitter-graph-5bc06-756cc58f5d-jkc9m\" (UID: \"03838bdb-892d-428b-b289-8fff306cda58\") " pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" Apr 23 17:08:12.186940 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:12.186910 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03838bdb-892d-428b-b289-8fff306cda58-openshift-service-ca-bundle\") pod \"splitter-graph-5bc06-756cc58f5d-jkc9m\" (UID: \"03838bdb-892d-428b-b289-8fff306cda58\") " pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" Apr 23 17:08:12.188628 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:12.188607 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/03838bdb-892d-428b-b289-8fff306cda58-proxy-tls\") pod \"splitter-graph-5bc06-756cc58f5d-jkc9m\" (UID: \"03838bdb-892d-428b-b289-8fff306cda58\") " pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" Apr 23 17:08:12.277763 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:12.277678 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" Apr 23 17:08:12.388927 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:12.388903 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m"] Apr 23 17:08:12.391260 ip-10-0-141-189 kubenswrapper[2561]: W0423 17:08:12.391235 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03838bdb_892d_428b_b289_8fff306cda58.slice/crio-3b3281c627aa49becfae6cdd634745f6a0f502cd59fe320fd5309bc0f78a856d WatchSource:0}: Error finding container 3b3281c627aa49becfae6cdd634745f6a0f502cd59fe320fd5309bc0f78a856d: Status 404 returned error can't find the container with id 3b3281c627aa49becfae6cdd634745f6a0f502cd59fe320fd5309bc0f78a856d Apr 23 17:08:13.310967 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:13.310932 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" event={"ID":"03838bdb-892d-428b-b289-8fff306cda58","Type":"ContainerStarted","Data":"f5b81c7c88907678f3ed9db7f071a9781497201aad48563823e0eab12677a56f"} Apr 23 17:08:13.310967 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:13.310972 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" event={"ID":"03838bdb-892d-428b-b289-8fff306cda58","Type":"ContainerStarted","Data":"3b3281c627aa49becfae6cdd634745f6a0f502cd59fe320fd5309bc0f78a856d"} Apr 23 17:08:13.311382 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:13.311067 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" Apr 23 17:08:13.327026 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:13.326976 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" podStartSLOduration=2.326960433 podStartE2EDuration="2.326960433s" podCreationTimestamp="2026-04-23 17:08:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:08:13.326926445 +0000 UTC m=+1987.880360615" watchObservedRunningTime="2026-04-23 17:08:13.326960433 +0000 UTC m=+1987.880394606" Apr 23 17:08:19.319678 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:08:19.319650 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" Apr 23 17:16:26.724253 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:26.724219 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m"] Apr 23 17:16:26.726793 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:26.724520 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" podUID="03838bdb-892d-428b-b289-8fff306cda58" containerName="splitter-graph-5bc06" containerID="cri-o://f5b81c7c88907678f3ed9db7f071a9781497201aad48563823e0eab12677a56f" gracePeriod=30 Apr 23 17:16:26.814614 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:26.814582 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc"] Apr 23 17:16:26.814856 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:26.814833 2561 kuberuntime_container.go:864] "Killing container 
with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc" podUID="6780ca19-3b6b-4ac5-98c0-2d25888afd97" containerName="kserve-container" containerID="cri-o://57bb7d3d2d47fb8e68594aaefa6b453d4a348b1858e501c38fadb533836c124c" gracePeriod=30 Apr 23 17:16:29.243158 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:29.243138 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc" Apr 23 17:16:29.317853 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:29.317810 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" podUID="03838bdb-892d-428b-b289-8fff306cda58" containerName="splitter-graph-5bc06" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:16:29.573246 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:29.573213 2561 generic.go:358] "Generic (PLEG): container finished" podID="6780ca19-3b6b-4ac5-98c0-2d25888afd97" containerID="57bb7d3d2d47fb8e68594aaefa6b453d4a348b1858e501c38fadb533836c124c" exitCode=0 Apr 23 17:16:29.573384 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:29.573260 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc" event={"ID":"6780ca19-3b6b-4ac5-98c0-2d25888afd97","Type":"ContainerDied","Data":"57bb7d3d2d47fb8e68594aaefa6b453d4a348b1858e501c38fadb533836c124c"} Apr 23 17:16:29.573384 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:29.573271 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc" Apr 23 17:16:29.573384 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:29.573287 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc" event={"ID":"6780ca19-3b6b-4ac5-98c0-2d25888afd97","Type":"ContainerDied","Data":"caa06229cce562dc42162ba0a3338dcade916de5535224e2c9cb6ddefde44786"} Apr 23 17:16:29.573384 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:29.573305 2561 scope.go:117] "RemoveContainer" containerID="57bb7d3d2d47fb8e68594aaefa6b453d4a348b1858e501c38fadb533836c124c" Apr 23 17:16:29.580650 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:29.580635 2561 scope.go:117] "RemoveContainer" containerID="57bb7d3d2d47fb8e68594aaefa6b453d4a348b1858e501c38fadb533836c124c" Apr 23 17:16:29.580917 ip-10-0-141-189 kubenswrapper[2561]: E0423 17:16:29.580896 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57bb7d3d2d47fb8e68594aaefa6b453d4a348b1858e501c38fadb533836c124c\": container with ID starting with 57bb7d3d2d47fb8e68594aaefa6b453d4a348b1858e501c38fadb533836c124c not found: ID does not exist" containerID="57bb7d3d2d47fb8e68594aaefa6b453d4a348b1858e501c38fadb533836c124c" Apr 23 17:16:29.580987 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:29.580926 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57bb7d3d2d47fb8e68594aaefa6b453d4a348b1858e501c38fadb533836c124c"} err="failed to get container status \"57bb7d3d2d47fb8e68594aaefa6b453d4a348b1858e501c38fadb533836c124c\": rpc error: code = NotFound desc = could not find container \"57bb7d3d2d47fb8e68594aaefa6b453d4a348b1858e501c38fadb533836c124c\": container with ID starting with 
57bb7d3d2d47fb8e68594aaefa6b453d4a348b1858e501c38fadb533836c124c not found: ID does not exist" Apr 23 17:16:29.592354 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:29.592337 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc"] Apr 23 17:16:29.596178 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:29.596158 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5bc06-predictor-77cbd75659-95hdc"] Apr 23 17:16:30.030744 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:30.030717 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6780ca19-3b6b-4ac5-98c0-2d25888afd97" path="/var/lib/kubelet/pods/6780ca19-3b6b-4ac5-98c0-2d25888afd97/volumes" Apr 23 17:16:34.318287 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:34.318242 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" podUID="03838bdb-892d-428b-b289-8fff306cda58" containerName="splitter-graph-5bc06" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:16:39.317589 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:39.317550 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" podUID="03838bdb-892d-428b-b289-8fff306cda58" containerName="splitter-graph-5bc06" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:16:39.317953 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:39.317651 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" Apr 23 17:16:44.317432 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:44.317389 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" podUID="03838bdb-892d-428b-b289-8fff306cda58" containerName="splitter-graph-5bc06" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:16:49.317766 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:49.317724 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" podUID="03838bdb-892d-428b-b289-8fff306cda58" containerName="splitter-graph-5bc06" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:16:54.317708 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:54.317671 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" podUID="03838bdb-892d-428b-b289-8fff306cda58" containerName="splitter-graph-5bc06" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:16:56.851357 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:56.851337 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" Apr 23 17:16:56.974800 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:56.974772 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/03838bdb-892d-428b-b289-8fff306cda58-proxy-tls\") pod \"03838bdb-892d-428b-b289-8fff306cda58\" (UID: \"03838bdb-892d-428b-b289-8fff306cda58\") " Apr 23 17:16:56.974983 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:56.974841 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03838bdb-892d-428b-b289-8fff306cda58-openshift-service-ca-bundle\") pod \"03838bdb-892d-428b-b289-8fff306cda58\" (UID: \"03838bdb-892d-428b-b289-8fff306cda58\") " Apr 23 17:16:56.975196 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:56.975168 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03838bdb-892d-428b-b289-8fff306cda58-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "03838bdb-892d-428b-b289-8fff306cda58" (UID: "03838bdb-892d-428b-b289-8fff306cda58"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:16:56.976723 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:56.976703 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03838bdb-892d-428b-b289-8fff306cda58-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "03838bdb-892d-428b-b289-8fff306cda58" (UID: "03838bdb-892d-428b-b289-8fff306cda58"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:16:57.075392 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:57.075364 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/03838bdb-892d-428b-b289-8fff306cda58-proxy-tls\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 17:16:57.075392 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:57.075392 2561 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03838bdb-892d-428b-b289-8fff306cda58-openshift-service-ca-bundle\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 17:16:57.646383 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:57.646350 2561 generic.go:358] "Generic (PLEG): container finished" podID="03838bdb-892d-428b-b289-8fff306cda58" containerID="f5b81c7c88907678f3ed9db7f071a9781497201aad48563823e0eab12677a56f" exitCode=0 Apr 23 17:16:57.646706 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:57.646416 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" event={"ID":"03838bdb-892d-428b-b289-8fff306cda58","Type":"ContainerDied","Data":"f5b81c7c88907678f3ed9db7f071a9781497201aad48563823e0eab12677a56f"} Apr 23 17:16:57.646706 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:57.646419 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" Apr 23 17:16:57.646706 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:57.646440 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m" event={"ID":"03838bdb-892d-428b-b289-8fff306cda58","Type":"ContainerDied","Data":"3b3281c627aa49becfae6cdd634745f6a0f502cd59fe320fd5309bc0f78a856d"} Apr 23 17:16:57.646706 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:57.646454 2561 scope.go:117] "RemoveContainer" containerID="f5b81c7c88907678f3ed9db7f071a9781497201aad48563823e0eab12677a56f" Apr 23 17:16:57.654762 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:57.654738 2561 scope.go:117] "RemoveContainer" containerID="f5b81c7c88907678f3ed9db7f071a9781497201aad48563823e0eab12677a56f" Apr 23 17:16:57.655042 ip-10-0-141-189 kubenswrapper[2561]: E0423 17:16:57.655020 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5b81c7c88907678f3ed9db7f071a9781497201aad48563823e0eab12677a56f\": container with ID starting with f5b81c7c88907678f3ed9db7f071a9781497201aad48563823e0eab12677a56f not found: ID does not exist" containerID="f5b81c7c88907678f3ed9db7f071a9781497201aad48563823e0eab12677a56f" Apr 23 17:16:57.655107 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:57.655051 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5b81c7c88907678f3ed9db7f071a9781497201aad48563823e0eab12677a56f"} err="failed to get container status \"f5b81c7c88907678f3ed9db7f071a9781497201aad48563823e0eab12677a56f\": rpc error: code = NotFound desc = could not find container \"f5b81c7c88907678f3ed9db7f071a9781497201aad48563823e0eab12677a56f\": container with ID starting with f5b81c7c88907678f3ed9db7f071a9781497201aad48563823e0eab12677a56f not found: ID does not exist" Apr 23 17:16:57.667864 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:57.667841 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m"] Apr 23 17:16:57.674209 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:57.674188 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-5bc06-756cc58f5d-jkc9m"] Apr 23 17:16:58.031295 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:16:58.031266 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03838bdb-892d-428b-b289-8fff306cda58" path="/var/lib/kubelet/pods/03838bdb-892d-428b-b289-8fff306cda58/volumes" Apr 23 17:23:50.647130 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:23:50.647098 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46"] Apr 23 17:23:50.647590 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:23:50.647325 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" podUID="598bcef5-3c35-41c3-8ea3-8d94f8cb2782" containerName="switch-graph-390fe" containerID="cri-o://e9671fd5f89a4287783e36b7388fec721dd0d2b5df9f1789ba645e2152ce80bc" gracePeriod=30 Apr 23 17:23:50.765754 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:23:50.765720 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk"] Apr 23 17:23:50.766094 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:23:50.766047 2561 kuberuntime_container.go:864] "Killing container with 
a grace period" pod="kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk" podUID="e30c495e-2209-4513-951e-8ca949dd4097" containerName="kserve-container" containerID="cri-o://acd31e149fec9c18eb68fa46fdc11f168d3bf4e7456126fb537b2326479e8bca" gracePeriod=30 Apr 23 17:23:53.016413 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:23:53.016376 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk" podUID="e30c495e-2209-4513-951e-8ca949dd4097" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 17:23:53.210479 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:23:53.210443 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" podUID="598bcef5-3c35-41c3-8ea3-8d94f8cb2782" containerName="switch-graph-390fe" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:23:53.498212 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:23:53.498189 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk" Apr 23 17:23:53.689561 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:23:53.689483 2561 generic.go:358] "Generic (PLEG): container finished" podID="e30c495e-2209-4513-951e-8ca949dd4097" containerID="acd31e149fec9c18eb68fa46fdc11f168d3bf4e7456126fb537b2326479e8bca" exitCode=0 Apr 23 17:23:53.689561 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:23:53.689519 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk" event={"ID":"e30c495e-2209-4513-951e-8ca949dd4097","Type":"ContainerDied","Data":"acd31e149fec9c18eb68fa46fdc11f168d3bf4e7456126fb537b2326479e8bca"} Apr 23 17:23:53.689561 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:23:53.689534 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk" Apr 23 17:23:53.689561 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:23:53.689550 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk" event={"ID":"e30c495e-2209-4513-951e-8ca949dd4097","Type":"ContainerDied","Data":"e90122ae84256aa462a0407c1fd3cebf22ff7758f062b064184002ff173f86a9"} Apr 23 17:23:53.689842 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:23:53.689572 2561 scope.go:117] "RemoveContainer" containerID="acd31e149fec9c18eb68fa46fdc11f168d3bf4e7456126fb537b2326479e8bca" Apr 23 17:23:53.696827 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:23:53.696818 2561 scope.go:117] "RemoveContainer" containerID="acd31e149fec9c18eb68fa46fdc11f168d3bf4e7456126fb537b2326479e8bca" Apr 23 17:23:53.697107 ip-10-0-141-189 kubenswrapper[2561]: E0423 17:23:53.697085 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acd31e149fec9c18eb68fa46fdc11f168d3bf4e7456126fb537b2326479e8bca\": container with ID starting with acd31e149fec9c18eb68fa46fdc11f168d3bf4e7456126fb537b2326479e8bca not found: ID does not exist" containerID="acd31e149fec9c18eb68fa46fdc11f168d3bf4e7456126fb537b2326479e8bca" Apr 23 17:23:53.697214 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:23:53.697112 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acd31e149fec9c18eb68fa46fdc11f168d3bf4e7456126fb537b2326479e8bca"} err="failed to get container status \"acd31e149fec9c18eb68fa46fdc11f168d3bf4e7456126fb537b2326479e8bca\": rpc error: code = NotFound desc = could not find container \"acd31e149fec9c18eb68fa46fdc11f168d3bf4e7456126fb537b2326479e8bca\": container with ID starting with acd31e149fec9c18eb68fa46fdc11f168d3bf4e7456126fb537b2326479e8bca not found: ID does not exist" Apr 23 17:23:53.712460 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:23:53.712436 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk"] Apr 23 17:23:53.714104 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:23:53.714085 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-390fe-predictor-75d8f85f6f-78jxk"] Apr 23 17:23:54.030990 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:23:54.030963 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e30c495e-2209-4513-951e-8ca949dd4097" path="/var/lib/kubelet/pods/e30c495e-2209-4513-951e-8ca949dd4097/volumes" Apr 23 17:23:58.209864 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:23:58.209831 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" podUID="598bcef5-3c35-41c3-8ea3-8d94f8cb2782" containerName="switch-graph-390fe" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:24:03.210114 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:03.210067 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" podUID="598bcef5-3c35-41c3-8ea3-8d94f8cb2782" containerName="switch-graph-390fe" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:24:03.210500 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:03.210176 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" Apr 23 17:24:05.616444 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:05.616413 2561 ???:1] "http: TLS handshake error from 10.0.129.102:55952: EOF" Apr 23 17:24:05.619059 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:05.619038 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-390fe-857467bf76-rgb46_598bcef5-3c35-41c3-8ea3-8d94f8cb2782/switch-graph-390fe/0.log" Apr 23 17:24:06.362959 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:06.362930 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-390fe-857467bf76-rgb46_598bcef5-3c35-41c3-8ea3-8d94f8cb2782/switch-graph-390fe/0.log" Apr 23 17:24:07.102342 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:07.102313 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-390fe-857467bf76-rgb46_598bcef5-3c35-41c3-8ea3-8d94f8cb2782/switch-graph-390fe/0.log" Apr 23 17:24:07.827224 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:07.827195 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-390fe-857467bf76-rgb46_598bcef5-3c35-41c3-8ea3-8d94f8cb2782/switch-graph-390fe/0.log" Apr 23 17:24:08.211052 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:08.210964 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" podUID="598bcef5-3c35-41c3-8ea3-8d94f8cb2782" containerName="switch-graph-390fe" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:24:08.555130 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:08.555102 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-390fe-857467bf76-rgb46_598bcef5-3c35-41c3-8ea3-8d94f8cb2782/switch-graph-390fe/0.log" Apr 23 17:24:09.271151 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:09.271120 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-390fe-857467bf76-rgb46_598bcef5-3c35-41c3-8ea3-8d94f8cb2782/switch-graph-390fe/0.log" Apr 23 17:24:09.985124 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:09.985094 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-390fe-857467bf76-rgb46_598bcef5-3c35-41c3-8ea3-8d94f8cb2782/switch-graph-390fe/0.log" Apr 23 17:24:10.718694 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:10.718667 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-390fe-857467bf76-rgb46_598bcef5-3c35-41c3-8ea3-8d94f8cb2782/switch-graph-390fe/0.log" Apr 23 17:24:11.452039 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:11.452011 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-390fe-857467bf76-rgb46_598bcef5-3c35-41c3-8ea3-8d94f8cb2782/switch-graph-390fe/0.log" Apr 23 17:24:12.167825 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:12.167798 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-390fe-857467bf76-rgb46_598bcef5-3c35-41c3-8ea3-8d94f8cb2782/switch-graph-390fe/0.log" Apr 23 17:24:12.888623 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:12.888588 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-390fe-857467bf76-rgb46_598bcef5-3c35-41c3-8ea3-8d94f8cb2782/switch-graph-390fe/0.log" Apr 23 17:24:13.210543 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:13.210458 2561 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" podUID="598bcef5-3c35-41c3-8ea3-8d94f8cb2782" containerName="switch-graph-390fe" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:24:13.645620 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:13.645596 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-390fe-857467bf76-rgb46_598bcef5-3c35-41c3-8ea3-8d94f8cb2782/switch-graph-390fe/0.log" Apr 23 17:24:16.144324 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.144294 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jm5zz/must-gather-wvj2x"] Apr 23 17:24:16.144666 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.144517 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03838bdb-892d-428b-b289-8fff306cda58" containerName="splitter-graph-5bc06" Apr 23 17:24:16.144666 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.144527 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="03838bdb-892d-428b-b289-8fff306cda58" containerName="splitter-graph-5bc06" Apr 23 17:24:16.144666 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.144540 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e30c495e-2209-4513-951e-8ca949dd4097" containerName="kserve-container" Apr 23 17:24:16.144666 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.144547 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="e30c495e-2209-4513-951e-8ca949dd4097" containerName="kserve-container" Apr 23 17:24:16.144666 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.144553 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6780ca19-3b6b-4ac5-98c0-2d25888afd97" containerName="kserve-container" Apr 23 17:24:16.144666 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.144558 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="6780ca19-3b6b-4ac5-98c0-2d25888afd97" containerName="kserve-container" Apr 23 17:24:16.144666 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.144599 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="e30c495e-2209-4513-951e-8ca949dd4097" containerName="kserve-container" Apr 23 17:24:16.144666 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.144608 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="03838bdb-892d-428b-b289-8fff306cda58" containerName="splitter-graph-5bc06" Apr 23 17:24:16.144666 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.144615 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="6780ca19-3b6b-4ac5-98c0-2d25888afd97" containerName="kserve-container" Apr 23 17:24:16.147322 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.147306 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jm5zz/must-gather-wvj2x" Apr 23 17:24:16.149772 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.149750 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jm5zz\"/\"default-dockercfg-rwbms\"" Apr 23 17:24:16.149772 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.149749 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jm5zz\"/\"kube-root-ca.crt\"" Apr 23 17:24:16.149929 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.149790 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jm5zz\"/\"openshift-service-ca.crt\"" Apr 23 17:24:16.154788 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.154768 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jm5zz/must-gather-wvj2x"] Apr 23 17:24:16.217089 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.217066 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5psv\" (UniqueName: \"kubernetes.io/projected/05756cbf-d6ca-4f63-a771-dc1e40306f52-kube-api-access-t5psv\") pod \"must-gather-wvj2x\" (UID: \"05756cbf-d6ca-4f63-a771-dc1e40306f52\") " pod="openshift-must-gather-jm5zz/must-gather-wvj2x" Apr 23 17:24:16.217172 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.217117 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/05756cbf-d6ca-4f63-a771-dc1e40306f52-must-gather-output\") pod \"must-gather-wvj2x\" (UID: \"05756cbf-d6ca-4f63-a771-dc1e40306f52\") " pod="openshift-must-gather-jm5zz/must-gather-wvj2x" Apr 23 17:24:16.318065 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.318043 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5psv\" (UniqueName: \"kubernetes.io/projected/05756cbf-d6ca-4f63-a771-dc1e40306f52-kube-api-access-t5psv\") pod \"must-gather-wvj2x\" (UID: \"05756cbf-d6ca-4f63-a771-dc1e40306f52\") " pod="openshift-must-gather-jm5zz/must-gather-wvj2x" Apr 23 17:24:16.318145 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.318090 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/05756cbf-d6ca-4f63-a771-dc1e40306f52-must-gather-output\") pod \"must-gather-wvj2x\" (UID: \"05756cbf-d6ca-4f63-a771-dc1e40306f52\") " pod="openshift-must-gather-jm5zz/must-gather-wvj2x" Apr 23 17:24:16.318354 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.318337 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/05756cbf-d6ca-4f63-a771-dc1e40306f52-must-gather-output\") pod \"must-gather-wvj2x\" (UID: \"05756cbf-d6ca-4f63-a771-dc1e40306f52\") " pod="openshift-must-gather-jm5zz/must-gather-wvj2x" Apr 23 17:24:16.325489 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.325470 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5psv\" (UniqueName: \"kubernetes.io/projected/05756cbf-d6ca-4f63-a771-dc1e40306f52-kube-api-access-t5psv\") pod \"must-gather-wvj2x\" (UID: \"05756cbf-d6ca-4f63-a771-dc1e40306f52\") " pod="openshift-must-gather-jm5zz/must-gather-wvj2x" Apr 23 17:24:16.456201 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.456153 2561 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-jm5zz/must-gather-wvj2x" Apr 23 17:24:16.568056 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.568022 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jm5zz/must-gather-wvj2x"] Apr 23 17:24:16.571922 ip-10-0-141-189 kubenswrapper[2561]: W0423 17:24:16.571868 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05756cbf_d6ca_4f63_a771_dc1e40306f52.slice/crio-d79438568e78dd6b3c5c4aea8390609e894ccec05a164f27b20c56c6994b2e80 WatchSource:0}: Error finding container d79438568e78dd6b3c5c4aea8390609e894ccec05a164f27b20c56c6994b2e80: Status 404 returned error can't find the container with id d79438568e78dd6b3c5c4aea8390609e894ccec05a164f27b20c56c6994b2e80 Apr 23 17:24:16.575915 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.575899 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 17:24:16.752679 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:16.752584 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jm5zz/must-gather-wvj2x" event={"ID":"05756cbf-d6ca-4f63-a771-dc1e40306f52","Type":"ContainerStarted","Data":"d79438568e78dd6b3c5c4aea8390609e894ccec05a164f27b20c56c6994b2e80"} Apr 23 17:24:17.757496 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:17.757457 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jm5zz/must-gather-wvj2x" event={"ID":"05756cbf-d6ca-4f63-a771-dc1e40306f52","Type":"ContainerStarted","Data":"5d5765ed0cd0dd3ad1f24690ee5c561b96a15cb59da25d8e726bb7e4396f22e9"} Apr 23 17:24:18.210737 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:18.210694 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" podUID="598bcef5-3c35-41c3-8ea3-8d94f8cb2782" containerName="switch-graph-390fe" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:24:18.763452 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:18.763409 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jm5zz/must-gather-wvj2x" event={"ID":"05756cbf-d6ca-4f63-a771-dc1e40306f52","Type":"ContainerStarted","Data":"6ff054cfd02b76de68b3e8a5732b1f9e8cfe28fb9912591c50562ca639d95690"} Apr 23 17:24:18.780493 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:18.780430 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jm5zz/must-gather-wvj2x" podStartSLOduration=1.8337285140000001 podStartE2EDuration="2.780412451s" podCreationTimestamp="2026-04-23 17:24:16 +0000 UTC" firstStartedPulling="2026-04-23 17:24:16.576032762 +0000 UTC m=+2951.129466911" lastFinishedPulling="2026-04-23 17:24:17.522716699 +0000 UTC m=+2952.076150848" observedRunningTime="2026-04-23 17:24:18.778227973 +0000 UTC m=+2953.331662142" watchObservedRunningTime="2026-04-23 17:24:18.780412451 +0000 UTC m=+2953.333846621" Apr 23 17:24:18.893275 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:18.893245 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-ng5vn_867d699c-6a2c-41bb-b885-26b9cd952983/global-pull-secret-syncer/0.log" Apr 23 17:24:18.926393 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:18.926367 2561 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_konnectivity-agent-8c5j2_51bf838a-6ae8-4ed3-bdd7-7040c25cac11/konnectivity-agent/0.log" Apr 23 17:24:19.043806 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:19.043781 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-189.ec2.internal_11fa0aa737c019b5fa81fed20955549f/haproxy/0.log" Apr 23 17:24:20.772376 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:20.772336 2561 generic.go:358] "Generic (PLEG): container finished" podID="598bcef5-3c35-41c3-8ea3-8d94f8cb2782" containerID="e9671fd5f89a4287783e36b7388fec721dd0d2b5df9f1789ba645e2152ce80bc" exitCode=137 Apr 23 17:24:20.772859 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:20.772454 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" event={"ID":"598bcef5-3c35-41c3-8ea3-8d94f8cb2782","Type":"ContainerDied","Data":"e9671fd5f89a4287783e36b7388fec721dd0d2b5df9f1789ba645e2152ce80bc"} Apr 23 17:24:21.419149 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:21.418818 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" Apr 23 17:24:21.473987 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:21.473576 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/598bcef5-3c35-41c3-8ea3-8d94f8cb2782-proxy-tls\") pod \"598bcef5-3c35-41c3-8ea3-8d94f8cb2782\" (UID: \"598bcef5-3c35-41c3-8ea3-8d94f8cb2782\") " Apr 23 17:24:21.474596 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:21.474206 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/598bcef5-3c35-41c3-8ea3-8d94f8cb2782-openshift-service-ca-bundle\") pod \"598bcef5-3c35-41c3-8ea3-8d94f8cb2782\" (UID: \"598bcef5-3c35-41c3-8ea3-8d94f8cb2782\") " Apr 23 17:24:21.474848 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:21.474814 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/598bcef5-3c35-41c3-8ea3-8d94f8cb2782-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "598bcef5-3c35-41c3-8ea3-8d94f8cb2782" (UID: "598bcef5-3c35-41c3-8ea3-8d94f8cb2782"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:24:21.480330 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:21.480288 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598bcef5-3c35-41c3-8ea3-8d94f8cb2782-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "598bcef5-3c35-41c3-8ea3-8d94f8cb2782" (UID: "598bcef5-3c35-41c3-8ea3-8d94f8cb2782"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:24:21.577343 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:21.577276 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/598bcef5-3c35-41c3-8ea3-8d94f8cb2782-proxy-tls\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 17:24:21.577343 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:21.577312 2561 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/598bcef5-3c35-41c3-8ea3-8d94f8cb2782-openshift-service-ca-bundle\") on node \"ip-10-0-141-189.ec2.internal\" DevicePath \"\"" Apr 23 17:24:21.780539 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:21.777244 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" event={"ID":"598bcef5-3c35-41c3-8ea3-8d94f8cb2782","Type":"ContainerDied","Data":"9c9941c438779b888d88a41e125cea6ef05289470962711908c40bbf8957f9d2"} Apr 23 17:24:21.780539 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:21.777296 2561 scope.go:117] "RemoveContainer" containerID="e9671fd5f89a4287783e36b7388fec721dd0d2b5df9f1789ba645e2152ce80bc" Apr 23 17:24:21.780539 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:21.777461 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46" Apr 23 17:24:21.812446 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:21.812396 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46"] Apr 23 17:24:21.812446 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:21.812449 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-390fe-857467bf76-rgb46"] Apr 23 17:24:22.034088 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:22.033999 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="598bcef5-3c35-41c3-8ea3-8d94f8cb2782" path="/var/lib/kubelet/pods/598bcef5-3c35-41c3-8ea3-8d94f8cb2782/volumes" Apr 23 17:24:22.938555 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:22.938529 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h6lxb_93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a/node-exporter/0.log" Apr 23 17:24:22.959936 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:22.959905 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h6lxb_93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a/kube-rbac-proxy/0.log" Apr 23 17:24:22.980921 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:22.980887 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h6lxb_93f3717b-c6ae-4dc7-8bfc-cc35c24bf51a/init-textfile/0.log" Apr 23 17:24:26.193812 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.193720 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx"] Apr 23 17:24:26.194373 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.194118 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="598bcef5-3c35-41c3-8ea3-8d94f8cb2782" containerName="switch-graph-390fe" Apr 23 17:24:26.194373 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.194139 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="598bcef5-3c35-41c3-8ea3-8d94f8cb2782" containerName="switch-graph-390fe" Apr 23 17:24:26.194373 
ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.194214 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="598bcef5-3c35-41c3-8ea3-8d94f8cb2782" containerName="switch-graph-390fe" Apr 23 17:24:26.198234 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.198212 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" Apr 23 17:24:26.203591 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.203565 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx"] Apr 23 17:24:26.317091 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.317062 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/af69fe19-1e44-4aca-95f4-4f012390cff5-podres\") pod \"perf-node-gather-daemonset-dg4rx\" (UID: \"af69fe19-1e44-4aca-95f4-4f012390cff5\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" Apr 23 17:24:26.317247 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.317101 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/af69fe19-1e44-4aca-95f4-4f012390cff5-lib-modules\") pod \"perf-node-gather-daemonset-dg4rx\" (UID: \"af69fe19-1e44-4aca-95f4-4f012390cff5\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" Apr 23 17:24:26.317247 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.317126 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af69fe19-1e44-4aca-95f4-4f012390cff5-sys\") pod \"perf-node-gather-daemonset-dg4rx\" (UID: \"af69fe19-1e44-4aca-95f4-4f012390cff5\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" Apr 23 17:24:26.317247 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.317188 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/af69fe19-1e44-4aca-95f4-4f012390cff5-proc\") pod \"perf-node-gather-daemonset-dg4rx\" (UID: \"af69fe19-1e44-4aca-95f4-4f012390cff5\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" Apr 23 17:24:26.317247 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.317224 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jdnf\" (UniqueName: \"kubernetes.io/projected/af69fe19-1e44-4aca-95f4-4f012390cff5-kube-api-access-6jdnf\") pod \"perf-node-gather-daemonset-dg4rx\" (UID: \"af69fe19-1e44-4aca-95f4-4f012390cff5\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" Apr 23 17:24:26.337183 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.337159 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cdjpw_0f0e9f3d-ecd0-4e57-8ef1-447361404429/dns/0.log" Apr 23 17:24:26.353193 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.353173 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cdjpw_0f0e9f3d-ecd0-4e57-8ef1-447361404429/kube-rbac-proxy/0.log" Apr 23 17:24:26.417942 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.417918 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jdnf\" (UniqueName: 
\"kubernetes.io/projected/af69fe19-1e44-4aca-95f4-4f012390cff5-kube-api-access-6jdnf\") pod \"perf-node-gather-daemonset-dg4rx\" (UID: \"af69fe19-1e44-4aca-95f4-4f012390cff5\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" Apr 23 17:24:26.418082 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.417986 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/af69fe19-1e44-4aca-95f4-4f012390cff5-podres\") pod \"perf-node-gather-daemonset-dg4rx\" (UID: \"af69fe19-1e44-4aca-95f4-4f012390cff5\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" Apr 23 17:24:26.418082 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.418029 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/af69fe19-1e44-4aca-95f4-4f012390cff5-lib-modules\") pod \"perf-node-gather-daemonset-dg4rx\" (UID: \"af69fe19-1e44-4aca-95f4-4f012390cff5\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" Apr 23 17:24:26.418082 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.418053 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af69fe19-1e44-4aca-95f4-4f012390cff5-sys\") pod \"perf-node-gather-daemonset-dg4rx\" (UID: \"af69fe19-1e44-4aca-95f4-4f012390cff5\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" Apr 23 17:24:26.418241 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.418086 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/af69fe19-1e44-4aca-95f4-4f012390cff5-proc\") pod \"perf-node-gather-daemonset-dg4rx\" (UID: \"af69fe19-1e44-4aca-95f4-4f012390cff5\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" Apr 23 17:24:26.418241 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.418096 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/af69fe19-1e44-4aca-95f4-4f012390cff5-podres\") pod \"perf-node-gather-daemonset-dg4rx\" (UID: \"af69fe19-1e44-4aca-95f4-4f012390cff5\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" Apr 23 17:24:26.418241 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.418151 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af69fe19-1e44-4aca-95f4-4f012390cff5-sys\") pod \"perf-node-gather-daemonset-dg4rx\" (UID: \"af69fe19-1e44-4aca-95f4-4f012390cff5\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" Apr 23 17:24:26.418241 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.418160 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/af69fe19-1e44-4aca-95f4-4f012390cff5-proc\") pod \"perf-node-gather-daemonset-dg4rx\" (UID: \"af69fe19-1e44-4aca-95f4-4f012390cff5\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" Apr 23 17:24:26.418241 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.418199 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/af69fe19-1e44-4aca-95f4-4f012390cff5-lib-modules\") pod \"perf-node-gather-daemonset-dg4rx\" (UID: \"af69fe19-1e44-4aca-95f4-4f012390cff5\") " 
pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" Apr 23 17:24:26.425445 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.425425 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jdnf\" (UniqueName: \"kubernetes.io/projected/af69fe19-1e44-4aca-95f4-4f012390cff5-kube-api-access-6jdnf\") pod \"perf-node-gather-daemonset-dg4rx\" (UID: \"af69fe19-1e44-4aca-95f4-4f012390cff5\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" Apr 23 17:24:26.450778 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.450729 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-476qf_842de44e-cb5c-472a-9cf5-7d48346188d8/dns-node-resolver/0.log" Apr 23 17:24:26.511102 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.511078 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" Apr 23 17:24:26.642537 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.642509 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx"] Apr 23 17:24:26.645670 ip-10-0-141-189 kubenswrapper[2561]: W0423 17:24:26.645642 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podaf69fe19_1e44_4aca_95f4_4f012390cff5.slice/crio-0d42663880637397d99aa5932ab6488955595a6cf5be21a7213da6c18e4cbafb WatchSource:0}: Error finding container 0d42663880637397d99aa5932ab6488955595a6cf5be21a7213da6c18e4cbafb: Status 404 returned error can't find the container with id 0d42663880637397d99aa5932ab6488955595a6cf5be21a7213da6c18e4cbafb Apr 23 17:24:26.795965 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.795929 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" event={"ID":"af69fe19-1e44-4aca-95f4-4f012390cff5","Type":"ContainerStarted","Data":"5c92c690a6d84a66140709cf6937b5c8501d4be6e9b68eaccbfe21b1b9b14273"} Apr 23 17:24:26.796113 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.795973 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" event={"ID":"af69fe19-1e44-4aca-95f4-4f012390cff5","Type":"ContainerStarted","Data":"0d42663880637397d99aa5932ab6488955595a6cf5be21a7213da6c18e4cbafb"} Apr 23 17:24:26.796217 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.796195 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" Apr 23 17:24:26.811952 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.811906 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" podStartSLOduration=0.811890239 podStartE2EDuration="811.890239ms" podCreationTimestamp="2026-04-23 17:24:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:24:26.810690899 +0000 UTC m=+2961.364125081" watchObservedRunningTime="2026-04-23 17:24:26.811890239 +0000 UTC m=+2961.365324400" Apr 23 17:24:26.887228 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:26.887200 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7wntm_9fd1b1de-6101-4e7d-a523-325e848a740a/node-ca/0.log" Apr 23 17:24:27.836343 ip-10-0-141-189 kubenswrapper[2561]: I0423 
17:24:27.836318 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-49hdt_c2a76a12-982a-438c-837c-0e7665a6f46c/serve-healthcheck-canary/0.log" Apr 23 17:24:28.295458 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:28.295432 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ps65z_148ee375-144e-4112-aa97-25371781944e/kube-rbac-proxy/0.log" Apr 23 17:24:28.311182 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:28.311161 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ps65z_148ee375-144e-4112-aa97-25371781944e/exporter/0.log" Apr 23 17:24:28.328386 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:28.328366 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ps65z_148ee375-144e-4112-aa97-25371781944e/extractor/0.log" Apr 23 17:24:30.230659 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:30.230628 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-5b898d7b9d-l9fp8_d241cedd-49d9-4bbc-88a1-81beb91b298d/manager/0.log" Apr 23 17:24:30.265918 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:30.265895 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-mflqr_e8542bf7-709a-4750-9a1a-1f87ef41d2cd/server/0.log" Apr 23 17:24:32.810794 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:32.810768 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-dg4rx" Apr 23 17:24:34.995217 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:34.995141 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-24kss_6b541e1a-ac49-4260-93b3-d5e6e7e04eb5/kube-multus/0.log" Apr 23 17:24:35.044675 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:35.044649 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6tcrq_3b9f6eca-1a5a-42f3-acf5-01a9d352a780/kube-multus-additional-cni-plugins/0.log" Apr 23 17:24:35.062262 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:35.062222 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6tcrq_3b9f6eca-1a5a-42f3-acf5-01a9d352a780/egress-router-binary-copy/0.log" Apr 23 17:24:35.081501 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:35.081405 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6tcrq_3b9f6eca-1a5a-42f3-acf5-01a9d352a780/cni-plugins/0.log" Apr 23 17:24:35.103160 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:35.103131 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6tcrq_3b9f6eca-1a5a-42f3-acf5-01a9d352a780/bond-cni-plugin/0.log" Apr 23 17:24:35.120034 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:35.120013 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6tcrq_3b9f6eca-1a5a-42f3-acf5-01a9d352a780/routeoverride-cni/0.log" Apr 23 17:24:35.138126 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:35.138106 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6tcrq_3b9f6eca-1a5a-42f3-acf5-01a9d352a780/whereabouts-cni-bincopy/0.log" Apr 23 17:24:35.157251 ip-10-0-141-189 kubenswrapper[2561]: I0423 
17:24:35.157220 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6tcrq_3b9f6eca-1a5a-42f3-acf5-01a9d352a780/whereabouts-cni/0.log" Apr 23 17:24:35.583239 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:35.583206 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rg2cr_5c564efe-4a26-4498-9f97-d71703d0aa18/network-metrics-daemon/0.log" Apr 23 17:24:35.600972 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:35.600936 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rg2cr_5c564efe-4a26-4498-9f97-d71703d0aa18/kube-rbac-proxy/0.log" Apr 23 17:24:36.774729 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:36.774696 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhhzs_1f8739d4-2dae-4fa1-b0be-80c6e24dce30/ovn-controller/0.log" Apr 23 17:24:36.803921 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:36.803892 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhhzs_1f8739d4-2dae-4fa1-b0be-80c6e24dce30/ovn-acl-logging/0.log" Apr 23 17:24:36.823802 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:36.823777 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhhzs_1f8739d4-2dae-4fa1-b0be-80c6e24dce30/kube-rbac-proxy-node/0.log" Apr 23 17:24:36.841225 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:36.841204 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhhzs_1f8739d4-2dae-4fa1-b0be-80c6e24dce30/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 17:24:36.861659 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:36.861630 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhhzs_1f8739d4-2dae-4fa1-b0be-80c6e24dce30/northd/0.log" Apr 23 17:24:36.881404 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:36.881332 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhhzs_1f8739d4-2dae-4fa1-b0be-80c6e24dce30/nbdb/0.log" Apr 23 17:24:36.899158 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:36.899128 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhhzs_1f8739d4-2dae-4fa1-b0be-80c6e24dce30/sbdb/0.log" Apr 23 17:24:37.020281 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:37.020254 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhhzs_1f8739d4-2dae-4fa1-b0be-80c6e24dce30/ovnkube-controller/0.log" Apr 23 17:24:37.970805 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:37.970772 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-bclp4_2f8d3b70-fe21-4feb-a984-12133895766b/network-check-target-container/0.log" Apr 23 17:24:38.845929 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:38.845903 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-hvjxm_0b73ac93-0681-4ba9-8f35-d63197135397/iptables-alerter/0.log" Apr 23 17:24:39.435355 ip-10-0-141-189 kubenswrapper[2561]: I0423 17:24:39.435334 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-gmwdt_543e930d-5825-406c-b2da-236f7eef2b83/tuned/0.log"