Apr 22 14:15:17.910720 ip-10-0-129-161 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 14:15:18.337540 ip-10-0-129-161 kubenswrapper[2566]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:18.337540 ip-10-0-129-161 kubenswrapper[2566]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 14:15:18.337540 ip-10-0-129-161 kubenswrapper[2566]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:18.337540 ip-10-0-129-161 kubenswrapper[2566]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 14:15:18.337540 ip-10-0-129-161 kubenswrapper[2566]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:18.340002 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.339913    2566 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 14:15:18.343758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343742    2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:18.343758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343758    2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:18.343820 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343762    2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:18.343820 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343765    2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:18.343820 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343767    2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:18.343820 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343770    2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:18.343820 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343773    2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:18.343820 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343776    2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:18.343820 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343778    2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:18.343820 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343781    2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:18.343820 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343784    2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:18.343820 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343787    2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:18.343820 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343790    2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:18.343820 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343797    2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:18.343820 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343800    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:18.343820 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343803    2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:18.343820 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343806    2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:18.343820 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343808    2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:18.343820 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343811    2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:18.343820 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343814    2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:18.343820 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343816    2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:18.343820 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343819    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:18.344295 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343822    2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:18.344295 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343825    2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:18.344295 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343828    2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:18.344295 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343830    2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:18.344295 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343833    2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:18.344295 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343838    2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
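The five deprecation warnings above all point the same way: these settings belong in the file named by --config (on this node, /etc/kubernetes/kubelet.conf per the FLAG dump further down), not on the command line. A minimal sketch of that migration, using only the flag values visible in this log; the field names are from the kubelet.config.k8s.io/v1beta1 KubeletConfiguration type, and treating this as a drop-in fragment of the node's actual file is an assumption:

```yaml
# Sketch only: config-file equivalents of the deprecated flags, with the
# values taken from this node's FLAG dump. Not the node's actual file.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: "/var/run/crio/crio.sock"            # was --container-runtime-endpoint
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec" # was --volume-plugin-dir
systemReserved:                                                # was --system-reserved
  cpu: "500m"
  ephemeral-storage: "1Gi"
  memory: "1Gi"
# --minimum-container-ttl-duration has no direct field; per the warning, use
# evictionHard / evictionSoft thresholds instead. --pod-infra-container-image
# moves to the CRI side (e.g. CRI-O's pause_image), as its warning notes.
```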
Apr 22 14:15:18.344295 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343842    2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:18.344295 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343845    2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:18.344295 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343848    2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:18.344295 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343851    2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:18.344295 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343853    2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:18.344295 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343856    2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:18.344295 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343859    2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:18.344295 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343861    2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:18.344295 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343864    2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:18.344295 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343866    2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:18.344295 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343869    2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:18.344295 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343871    2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:18.344295 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343874    2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:18.344766 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343876    2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:18.344766 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343879    2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:18.344766 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343881    2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:18.344766 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343884    2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:18.344766 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343886    2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:18.344766 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343894    2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:18.344766 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343897    2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:18.344766 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343899    2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:18.344766 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343902    2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:18.344766 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343904    2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:18.344766 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343907    2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:18.344766 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343909    2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:18.344766 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343912    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:18.344766 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343914    2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:18.344766 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343918    2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:18.344766 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343921    2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:18.344766 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343925    2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:18.344766 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343927    2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:18.344766 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343930    2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:18.344766 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343932    2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:18.345289 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343935    2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:18.345289 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343938    2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:18.345289 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343940    2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:18.345289 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343943    2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:18.345289 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343946    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:18.345289 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343949    2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:18.345289 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343951    2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:18.345289 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343954    2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:18.345289 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343958    2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
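The long run of "unrecognized feature gate" warnings is expected on an OpenShift node: the cluster's feature set includes many operator-level gates (ManagedBootImages, GatewayAPI, and the rest) that the kubelet's own gate registry does not know, so its parser warns and moves on rather than failing startup. A minimal Go sketch of that warn-and-continue behavior; this is illustrative only, not the actual k8s.io/component-base/featuregate code, and the `known` set here is a stand-in:

```go
package main

import "log"

// known stands in for the kubelet's registry of gates it understands.
var known = map[string]bool{
	"KMSv1":                          true,
	"ServiceAccountTokenNodeBinding": true,
	// ... the kubelet's remaining upstream gates ...
}

// apply mimics the warn-and-continue parsing seen at feature_gate.go:328:
// unknown names are logged and skipped, known names are applied.
func apply(requested, effective map[string]bool) {
	for name, value := range requested {
		if !known[name] {
			log.Printf("unrecognized feature gate: %s", name)
			continue // a cluster-level gate is not a kubelet startup error
		}
		effective[name] = value
	}
}

func main() {
	effective := map[string]bool{}
	apply(map[string]bool{
		"ManagedBootImages": true, // OpenShift-only gate -> warning, skipped
		"KMSv1":             true, // known (deprecated) gate -> applied
	}, effective)
	log.Printf("feature gates: %v", effective)
}
```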
Apr 22 14:15:18.345289 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343961    2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:18.345289 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343964    2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:18.345289 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343967    2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:18.345289 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343970    2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:18.345289 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343972    2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:18.345289 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343975    2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:18.345289 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343978    2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:18.345289 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343980    2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:18.345289 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343983    2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:18.345289 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343991    2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:18.345758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343995    2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:18.345758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.343998    2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:18.345758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344000    2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:18.345758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344003    2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:18.345758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344006    2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:18.345758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344008    2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:18.345758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344461    2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:18.345758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344469    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:18.345758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344472    2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:18.345758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344475    2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:18.345758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344479    2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:18.345758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344481    2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:18.345758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344484    2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:18.345758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344487    2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:18.345758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344490    2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:18.345758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344493    2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:18.345758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344495    2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:18.345758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344497    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:18.345758 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344500    2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:18.346221 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344503    2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:18.346221 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344506    2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:18.346221 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344508    2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:18.346221 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344512    2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:18.346221 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344516    2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:18.346221 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344518    2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:18.346221 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344521    2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:18.346221 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344524    2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:18.346221 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344526    2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:18.346221 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344529    2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:18.346221 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344531    2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:18.346221 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344534    2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:18.346221 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344537    2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:18.346221 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344540    2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:18.346221 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344542    2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:18.346221 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344545    2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:18.346221 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344547    2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:18.346221 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344550    2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:18.346221 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344553    2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:18.346718 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344555    2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:18.346718 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344558    2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:18.346718 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344561    2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:18.346718 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344563    2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:18.346718 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344566    2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:18.346718 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344568    2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:18.346718 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344571    2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:18.346718 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344573    2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:18.346718 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344576    2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:18.346718 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344579    2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:18.346718 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344581    2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:18.346718 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344584    2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:18.346718 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344586    2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:18.346718 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344589    2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:18.346718 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344591    2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:18.346718 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344594    2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:18.346718 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344596    2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:18.346718 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344599    2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:18.346718 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344601    2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:18.346718 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344604    2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:18.347213 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344606    2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:18.347213 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344609    2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:18.347213 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344611    2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:18.347213 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344615    2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:18.347213 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344620    2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:18.347213 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344623    2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:18.347213 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344626    2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:18.347213 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344628    2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:18.347213 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344631    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:18.347213 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344633    2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:18.347213 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344636    2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:18.347213 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344639    2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:18.347213 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344642    2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:18.347213 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344645    2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:18.347213 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344647    2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:18.347213 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344650    2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:18.347213 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344653    2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:18.347213 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344655    2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:18.347213 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344658    2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:18.347213 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344661    2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344664    2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344667    2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344669    2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344671    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344674    2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344676    2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344679    2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344681    2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344684    2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344686    2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344689    2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344691    2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344694    2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.344696    2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346068    2566 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346077    2566 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346083    2566 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346088    2566 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346094    2566 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346098    2566 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 14:15:18.347721 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346102    2566 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346107    2566 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346110    2566 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346114    2566 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346117    2566 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346121    2566 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346124    2566 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346127    2566 flags.go:64] FLAG: --cgroup-root=""
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346130    2566 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346133    2566 flags.go:64] FLAG: --client-ca-file=""
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346136    2566 flags.go:64] FLAG: --cloud-config=""
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346139    2566 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346142    2566 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346146    2566 flags.go:64] FLAG: --cluster-domain=""
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346149    2566 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346153    2566 flags.go:64] FLAG: --config-dir=""
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346156    2566 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346159    2566 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346163    2566 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346166    2566 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346169    2566 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346172    2566 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346176    2566 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346178    2566 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 14:15:18.348253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346181    2566 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346185    2566 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346188    2566 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346193    2566 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346196    2566 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346199    2566 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346202    2566 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346205    2566 flags.go:64] FLAG: --enable-server="true"
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346209    2566 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346213    2566 flags.go:64] FLAG: --event-burst="100"
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346216    2566 flags.go:64] FLAG: --event-qps="50"
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346223    2566 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346227    2566 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346230    2566 flags.go:64] FLAG: --eviction-hard=""
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346234    2566 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346237    2566 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346240    2566 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346243    2566 flags.go:64] FLAG: --eviction-soft=""
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346246    2566 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346249    2566 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346252    2566 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346255    2566 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346258    2566 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346261    2566 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346264    2566 flags.go:64] FLAG: --feature-gates=""
Apr 22 14:15:18.348919 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346267    2566 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346271    2566 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346274    2566 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346277    2566 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346280    2566 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346283    2566 flags.go:64] FLAG: --help="false"
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346286    2566 flags.go:64] FLAG: --hostname-override="ip-10-0-129-161.ec2.internal"
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346289    2566 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346292    2566 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346295    2566 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346299    2566 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346303    2566 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346305    2566 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346309    2566 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346312    2566 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346315    2566 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346318    2566 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346322    2566 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346326    2566 flags.go:64] FLAG: --kube-reserved=""
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346329    2566 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346332    2566 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346335    2566 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346338    2566 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346341    2566 flags.go:64] FLAG: --lock-file=""
Apr 22 14:15:18.349544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346343    2566 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346347    2566 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346350    2566 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346355    2566 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346358    2566 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346361    2566 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346364    2566 flags.go:64] FLAG: --logging-format="text"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346367    2566 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346370    2566 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346373    2566 flags.go:64] FLAG: --manifest-url=""
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346376    2566 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346380    2566 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346383    2566 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346388    2566 flags.go:64] FLAG: --max-pods="110"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346391    2566 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346394    2566 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346397    2566 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346400    2566 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346403    2566 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346406    2566 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346409    2566 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346416    2566 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346419    2566 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346422    2566 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 14:15:18.350164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346426    2566 flags.go:64] FLAG: --pod-cidr=""
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346442    2566 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346448    2566 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346453    2566 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346457    2566 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346460    2566 flags.go:64] FLAG: --port="10250"
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346463    2566 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346466    2566 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e9433a27766a6b0d"
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346469    2566 flags.go:64] FLAG: --qos-reserved=""
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346472    2566 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346475    2566 flags.go:64] FLAG: --register-node="true"
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346478    2566 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346481    2566 flags.go:64] FLAG: --register-with-taints=""
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346485    2566 flags.go:64] FLAG: --registry-burst="10"
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346488    2566 flags.go:64] FLAG: --registry-qps="5"
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346491    2566 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346494    2566 flags.go:64] FLAG: --reserved-memory=""
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346498    2566 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346501    2566 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346504    2566 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346507    2566 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346510    2566 flags.go:64] FLAG: --runonce="false"
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346512    2566 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346516    2566 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346519    2566 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 14:15:18.350826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346522    2566 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346525    2566 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346528    2566 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346531    2566 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346536    2566 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346539    2566 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346542    2566 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346545    2566 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346548    2566 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346551    2566 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346555    2566 flags.go:64] FLAG: --system-cgroups=""
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346558    2566 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346564    2566 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346567    2566 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346570    2566 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346574    2566 flags.go:64] FLAG: --tls-min-version=""
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346577    2566 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346580    2566 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346583    2566 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346586    2566 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346589    2566 flags.go:64] FLAG: --v="2"
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346593    2566 flags.go:64] FLAG: --version="false"
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346597    2566 flags.go:64] FLAG: --vmodule=""
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346601    2566 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346604    2566 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 14:15:18.351426 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346692    2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:18.352079 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346696    2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:18.352079 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346699    2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:18.352079 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346702    2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:18.352079 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346705    2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:18.352079 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346708    2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:18.352079 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346712    2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:18.352079 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346715    2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:18.352079 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346717    2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:18.352079 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346721    2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:18.352079 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346724    2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:18.352079 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346730    2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:18.352079 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346733    2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:18.352079 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346736    2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:18.352079 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346739    2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:18.352079 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346743    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:18.352079 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346746    2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:18.352079 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346752    2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:18.352079 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346755    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:18.352079 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346758    2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:18.352079 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346761    2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:18.352769 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346764    2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:18.352769 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346767    2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:18.352769 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346770    2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:18.352769 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346772    2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:18.352769 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346775    2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:18.352769 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346777    2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:18.352769 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346780    2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:18.352769 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346783    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:18.352769 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346785    2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:18.352769 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346788    2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:18.352769 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346790    2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:18.352769 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346793    2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:18.352769 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346796    2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:18.352769 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346799    2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:18.352769 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346801    2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:18.352769 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346804    2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:18.352769 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346807    2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:18.352769 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346809    2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:18.352769 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346811    2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:18.353393 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346814    2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:18.353393 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346816    2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:18.353393 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346819    2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:18.353393 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346823    2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:18.353393 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346826    2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:18.353393 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346829    2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:18.353393 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346831    2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:18.353393 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346834    2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:18.353393 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346837    2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:18.353393 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346841    2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:18.353393 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346845    2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:18.353393 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346849 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 14:15:18.353393 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346851 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 14:15:18.353393 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346854 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 14:15:18.353393 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346857 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 14:15:18.353393 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346859 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 14:15:18.353393 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346862 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 14:15:18.353393 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346864 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 14:15:18.353393 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346867 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 14:15:18.353918 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346870 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 14:15:18.353918 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346872 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 14:15:18.353918 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346875 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 14:15:18.353918 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346877 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 14:15:18.353918 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346880 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 14:15:18.353918 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346883 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 14:15:18.353918 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346885 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 14:15:18.353918 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346888 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 14:15:18.353918 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346890 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 14:15:18.353918 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346893 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 14:15:18.353918 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346895 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 14:15:18.353918 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346898 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 14:15:18.353918 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346900 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 14:15:18.353918 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346903 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 14:15:18.353918 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346906 2566 feature_gate.go:328] unrecognized 
feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 14:15:18.353918 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346908 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 14:15:18.353918 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346912 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 14:15:18.353918 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346914 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 14:15:18.353918 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346917 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 14:15:18.353918 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346920 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 14:15:18.354501 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346922 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 14:15:18.354501 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346925 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 14:15:18.354501 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346929 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 14:15:18.354501 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346932 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 14:15:18.354501 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346935 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 14:15:18.354501 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346937 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 14:15:18.354501 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.346940 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 22 14:15:18.354501 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.346945 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 14:15:18.355200 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.355181 2566 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 14:15:18.355233 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.355202 2566 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 14:15:18.355261 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355249 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 14:15:18.355261 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355254 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 14:15:18.355261 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355258 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 14:15:18.355261 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355261 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 14:15:18.355368 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355264 2566 feature_gate.go:328] unrecognized feature gate: 
NewOLMPreflightPermissionChecks Apr 22 14:15:18.355368 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355267 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 14:15:18.355368 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355270 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 14:15:18.355368 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355272 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 14:15:18.355368 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355275 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 14:15:18.355368 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355279 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 14:15:18.355368 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355284 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 14:15:18.355368 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355287 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 14:15:18.355368 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355290 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 14:15:18.355368 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355293 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 14:15:18.355368 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355296 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 14:15:18.355368 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355299 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 14:15:18.355368 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355302 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 14:15:18.355368 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355304 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 14:15:18.355368 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355307 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 14:15:18.355368 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355311 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
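Note that the same warning set repeats within a few milliseconds (the 14:15:18.346* batch above, the 14:15:18.355* batches that follow): the kubelet applies the gate list more than once while loading its configuration, and every pass re-logs each unknown name. When reading a node journal, filtering out the repeats makes the actual startup events easier to follow; a minimal sketch, assuming the standard kubelet.service unit on this host:

    journalctl -u kubelet --no-hostname | grep -v 'unrecognized feature gate' | less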
Apr 22 14:15:18.355368 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355315 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 22 14:15:18.355368 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355318 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 14:15:18.355368 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355320 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 14:15:18.355940 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355323 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 14:15:18.355940 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355325 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 14:15:18.355940 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355340 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 14:15:18.355940 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355343 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 14:15:18.355940 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355346 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 14:15:18.355940 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355349 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 14:15:18.355940 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355352 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 14:15:18.355940 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355355 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 14:15:18.355940 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355358 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 14:15:18.355940 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355360 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 14:15:18.355940 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355363 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 14:15:18.355940 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355365 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 14:15:18.355940 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355368 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 14:15:18.355940 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355371 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 14:15:18.355940 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355374 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 14:15:18.355940 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355376 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 14:15:18.355940 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355379 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 14:15:18.355940 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355382 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 14:15:18.355940 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355385 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 14:15:18.355940 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355387 
2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 14:15:18.356420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355390 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 14:15:18.356420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355392 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 14:15:18.356420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355395 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 14:15:18.356420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355397 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 14:15:18.356420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355400 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 14:15:18.356420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355402 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 14:15:18.356420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355405 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 14:15:18.356420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355407 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 14:15:18.356420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355410 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 14:15:18.356420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355413 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 14:15:18.356420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355415 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 14:15:18.356420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355418 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 14:15:18.356420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355420 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 14:15:18.356420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355423 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 14:15:18.356420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355426 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 14:15:18.356420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355429 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 14:15:18.356420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355448 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 14:15:18.356420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355452 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 14:15:18.356420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355457 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 14:15:18.356420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355461 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 14:15:18.356934 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355465 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 14:15:18.356934 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355469 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 
14:15:18.356934 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355472 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 14:15:18.356934 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355475 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 14:15:18.356934 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355477 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 14:15:18.356934 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355480 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 14:15:18.356934 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355483 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 14:15:18.356934 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355485 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 14:15:18.356934 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355488 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 14:15:18.356934 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355491 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 14:15:18.356934 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355494 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 14:15:18.356934 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355496 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 14:15:18.356934 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355499 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 14:15:18.356934 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355508 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 14:15:18.356934 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355511 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 14:15:18.356934 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355514 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 14:15:18.356934 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355516 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 14:15:18.356934 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355519 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 14:15:18.356934 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355522 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 14:15:18.356934 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355524 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 14:15:18.357420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355527 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 14:15:18.357420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355530 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 14:15:18.357420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355532 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 14:15:18.357420 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.355538 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false 
MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 14:15:18.357420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355630 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 14:15:18.357420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355635 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 14:15:18.357420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355639 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 14:15:18.357420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355642 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 22 14:15:18.357420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355644 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 14:15:18.357420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355647 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 14:15:18.357420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355650 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 14:15:18.357420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355653 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 14:15:18.357420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355656 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 14:15:18.357420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355659 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 14:15:18.357420 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355662 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 14:15:18.357806 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355664 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 14:15:18.357806 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355667 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 14:15:18.357806 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355669 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 14:15:18.357806 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355672 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 14:15:18.357806 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355674 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 14:15:18.357806 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355677 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 14:15:18.357806 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355680 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 14:15:18.357806 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355682 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 14:15:18.357806 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355684 2566 feature_gate.go:328] unrecognized feature gate: 
DyanmicServiceEndpointIBMCloud Apr 22 14:15:18.357806 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355687 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 14:15:18.357806 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355690 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 14:15:18.357806 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355700 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 14:15:18.357806 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355703 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 14:15:18.357806 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355705 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 14:15:18.357806 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355708 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 14:15:18.357806 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355711 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 14:15:18.357806 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355713 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 14:15:18.357806 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355716 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 14:15:18.357806 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355718 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 14:15:18.358311 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355721 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 14:15:18.358311 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355724 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 14:15:18.358311 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355727 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 14:15:18.358311 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355730 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 14:15:18.358311 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355732 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 14:15:18.358311 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355735 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 14:15:18.358311 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355738 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 14:15:18.358311 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355740 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 14:15:18.358311 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355742 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 14:15:18.358311 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355745 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 14:15:18.358311 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355747 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 14:15:18.358311 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355750 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 
14:15:18.358311 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355752 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 14:15:18.358311 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355755 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 14:15:18.358311 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355757 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 14:15:18.358311 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355760 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 14:15:18.358311 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355762 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 14:15:18.358311 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355765 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 14:15:18.358311 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355767 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 14:15:18.358311 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355770 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 14:15:18.358811 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355772 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 14:15:18.358811 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355774 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 14:15:18.358811 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355777 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 14:15:18.358811 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355780 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 14:15:18.358811 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355782 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 14:15:18.358811 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355792 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 14:15:18.358811 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355795 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 14:15:18.358811 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355797 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 14:15:18.358811 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355800 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 14:15:18.358811 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355802 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 14:15:18.358811 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355805 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 14:15:18.358811 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355807 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 14:15:18.358811 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355810 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 14:15:18.358811 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355813 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 14:15:18.358811 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355815 2566 feature_gate.go:328] 
unrecognized feature gate: GCPClusterHostedDNS Apr 22 14:15:18.358811 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355818 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 14:15:18.358811 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355820 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 14:15:18.358811 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355823 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 14:15:18.358811 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355826 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 14:15:18.358811 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355829 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 14:15:18.359298 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355831 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 14:15:18.359298 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355834 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 14:15:18.359298 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355836 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 14:15:18.359298 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355839 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 14:15:18.359298 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355841 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 14:15:18.359298 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355843 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 14:15:18.359298 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355846 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 14:15:18.359298 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355848 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 14:15:18.359298 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355851 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 14:15:18.359298 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355854 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 14:15:18.359298 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355856 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 14:15:18.359298 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355860 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
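After the unknown names are dropped, the kubelet logs the gate map it actually applies (the feature_gate.go:384 lines: twice above, and once more just below). Written as the featureGates stanza of a KubeletConfiguration file, that effective map corresponds to the following sketch; the kind and field are the standard kubelet config API, and the values are copied from the logged map:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    featureGates:
      DynamicResourceAllocation: false
      EventedPLEG: false
      ImageVolume: true
      KMSv1: true
      MaxUnavailableStatefulSet: false
      MinimumKubeletVersion: false
      MutatingAdmissionPolicy: false
      NodeSwap: false
      ProcMountType: true
      RouteExternalCertificate: true
      SELinuxMount: false
      ServiceAccountTokenNodeBinding: true
      StoragePerformantSecurityPolicy: true
      TranslateStreamCloseWebsocketRequests: false
      UserNamespacesPodSecurityStandards: true
      UserNamespacesSupport: true
      VolumeAttributesClass: false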
Apr 22 14:15:18.359298 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355863 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 14:15:18.359298 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355866 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 14:15:18.359298 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355868 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 14:15:18.359298 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:18.355871 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 14:15:18.359695 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.355876 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 14:15:18.359695 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.356518 2566 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 14:15:18.360563 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.360549 2566 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 14:15:18.361498 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.361488 2566 server.go:1019] "Starting client certificate rotation" Apr 22 14:15:18.361609 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.361592 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 14:15:18.361642 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.361628 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 14:15:18.389138 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.389121 2566 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 14:15:18.394772 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.394749 2566 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 14:15:18.423984 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.423965 2566 log.go:25] "Validated CRI v1 runtime API" Apr 22 14:15:18.425497 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.425479 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 14:15:18.429028 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.429016 2566 log.go:25] "Validated CRI v1 image API" Apr 22 14:15:18.430214 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.430200 2566 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 14:15:18.433889 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.433865 2566 fs.go:135] Filesystem UUIDs: map[04890cdf-7d77-46b4-91a6-ccce5b1d66a7:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 cbed1c74-2408-4f81-809d-b8065d55f9c8:/dev/nvme0n1p3] Apr 22 14:15:18.433943 ip-10-0-129-161 
kubenswrapper[2566]: I0422 14:15:18.433888 2566 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 14:15:18.441126 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.441012 2566 manager.go:217] Machine: {Timestamp:2026-04-22 14:15:18.438778576 +0000 UTC m=+0.418289946 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3106414 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec25505e79c94fee9ab7ba5ffa783a30 SystemUUID:ec25505e-79c9-4fee-9ab7-ba5ffa783a30 BootID:6f8a80d3-6a82-45d9-bced-83fee2cc4d62 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:5c:4d:29:d0:25 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:5c:4d:29:d0:25 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6e:7b:1b:41:a2:ac Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 14:15:18.441126 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.441115 2566 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
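The bootstrap sequence above ("Client rotation is on, will bootstrap in background", bootstrap credentials used to request a cert, "Rotating certificates") creates a CertificateSigningRequest that must be approved before the kubelet obtains its client certificate; the approval and issuance of csr-nlwts appear a little further down in this log. From the cluster side the handshake can be watched with something like the following (illustrative; csr-nlwts is the request name taken from this log):

    # Show kubelet client CSRs and their approval state:
    oc get csr
    # Approve one manually if the automatic approver is not running:
    oc adm certificate approve csr-nlwts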
Apr 22 14:15:18.441256 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.441198 2566 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 14:15:18.442392 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.442370 2566 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 14:15:18.442584 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.442395 2566 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-161.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 14:15:18.442631 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.442594 2566 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 14:15:18.442631 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.442603 2566 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 14:15:18.442631 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.442616 2566 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 14:15:18.443470 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.443459 2566 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 14:15:18.445525 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.445515 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 22 14:15:18.445801 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.445791 2566 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 14:15:18.448384 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.448374 2566 kubelet.go:491] "Attempting to sync node with API server" Apr 22 14:15:18.448420 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.448391 2566 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 22 14:15:18.448420 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.448403 2566 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 14:15:18.448420 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.448413 2566 kubelet.go:397] "Adding apiserver pod source" Apr 22 14:15:18.448551 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.448422 2566 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 14:15:18.449758 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.449747 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 14:15:18.449799 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.449765 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 14:15:18.455097 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.455081 2566 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 14:15:18.458746 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.458732 2566 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 14:15:18.461626 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.461613 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 14:15:18.461697 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.461630 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 14:15:18.461697 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.461636 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 14:15:18.461697 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.461641 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 14:15:18.461697 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.461648 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 14:15:18.461697 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.461653 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 14:15:18.461697 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.461659 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 14:15:18.461697 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.461664 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 14:15:18.461697 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.461670 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 14:15:18.461697 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.461676 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 14:15:18.461697 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.461686 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 14:15:18.461697 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.461694 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 14:15:18.463077 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:18.463057 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-161.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the 
cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 14:15:18.463138 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:18.463078 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 14:15:18.463178 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.463154 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 14:15:18.463178 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.463162 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 14:15:18.466743 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.466730 2566 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 14:15:18.466805 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.466766 2566 server.go:1295] "Started kubelet" Apr 22 14:15:18.466882 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.466856 2566 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 14:15:18.466937 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.466893 2566 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 14:15:18.466967 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.466960 2566 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 14:15:18.467792 ip-10-0-129-161 systemd[1]: Started Kubernetes Kubelet. Apr 22 14:15:18.471562 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.471547 2566 server.go:317] "Adding debug handlers to kubelet server" Apr 22 14:15:18.471953 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.471933 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nlwts" Apr 22 14:15:18.473545 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.473527 2566 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 14:15:18.473887 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.473873 2566 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-161.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 14:15:18.479975 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.479954 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nlwts" Apr 22 14:15:18.482829 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:18.482808 2566 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 14:15:18.484024 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:18.482785 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-161.ec2.internal.18a8b3697236066f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-161.ec2.internal,UID:ip-10-0-129-161.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-161.ec2.internal,},FirstTimestamp:2026-04-22 14:15:18.466741871 +0000 UTC m=+0.446253241,LastTimestamp:2026-04-22 14:15:18.466741871 +0000 UTC m=+0.446253241,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-161.ec2.internal,}"
Apr 22 14:15:18.489527 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.489511 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 14:15:18.490273 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.490254 2566 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 14:15:18.490923 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.490906 2566 factory.go:55] Registering systemd factory
Apr 22 14:15:18.491006 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.490928 2566 factory.go:223] Registration of the systemd container factory successfully
Apr 22 14:15:18.492428 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.492295 2566 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 14:15:18.492512 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.492307 2566 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 14:15:18.492512 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.492500 2566 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 14:15:18.492596 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:18.492545 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-161.ec2.internal\" not found"
Apr 22 14:15:18.492649 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.492641 2566 factory.go:153] Registering CRI-O factory
Apr 22 14:15:18.492691 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.492655 2566 factory.go:223] Registration of the crio container factory successfully
Apr 22 14:15:18.492741 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.492697 2566 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 14:15:18.492741 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.492643 2566 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 14:15:18.492741 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.492719 2566 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 14:15:18.492741 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.492730 2566 factory.go:103] Registering Raw factory
Apr 22 14:15:18.492900 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.492745 2566 manager.go:1196] Started watching for new ooms in manager
Apr 22 14:15:18.493371 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.493354 2566 manager.go:319] Starting recovery of all containers
Apr 22 14:15:18.503462 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.503318 2566 manager.go:324] Recovery completed
Apr 22 14:15:18.503828 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.503811 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:18.507535 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:18.507518 2566 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-161.ec2.internal\" not found" node="ip-10-0-129-161.ec2.internal"
Apr 22 14:15:18.507667 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.507656 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:18.509884 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.509870 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-161.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:18.509969 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.509895 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-161.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:18.509969 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.509905 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-161.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:18.510357 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.510342 2566 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 14:15:18.510357 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.510356 2566 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 14:15:18.510493 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.510377 2566 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 14:15:18.512407 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.512392 2566 policy_none.go:49] "None policy: Start"
Apr 22 14:15:18.512515 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.512414 2566 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 14:15:18.512515 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.512427 2566 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 14:15:18.556021 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.556005 2566 manager.go:341] "Starting Device Plugin manager"
Apr 22 14:15:18.564289 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:18.556035 2566 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 14:15:18.564289 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.556048 2566 server.go:85] "Starting device plugin registration server"
Apr 22 14:15:18.564289 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.556318 2566 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 14:15:18.564289 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.556330 2566 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 14:15:18.564289 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.556441 2566 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 14:15:18.564289 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.556517 2566 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 14:15:18.564289 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.556527 2566 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 14:15:18.564289 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:18.557102 2566 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 14:15:18.564289 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:18.557142 2566 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-161.ec2.internal\" not found"
Apr 22 14:15:18.627244 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.627215 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 14:15:18.628415 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.628395 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 14:15:18.628539 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.628420 2566 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 14:15:18.628539 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.628503 2566 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 14:15:18.628539 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.628513 2566 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 14:15:18.628678 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:18.628554 2566 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 14:15:18.631254 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.631237 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:18.656586 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.656566 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:18.658081 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.658063 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-161.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:18.658156 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.658095 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-161.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:18.658156 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.658108 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-161.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:18.658156 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.658140 2566 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-161.ec2.internal"
Apr 22 14:15:18.667483 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.667461 2566 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-161.ec2.internal"
Apr 22 14:15:18.667541 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:18.667486 2566 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-161.ec2.internal\": node \"ip-10-0-129-161.ec2.internal\" not found"
Apr 22 14:15:18.683176 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:18.683156 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-161.ec2.internal\" not found"
Apr 22 14:15:18.728770 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.728739 2566 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-161.ec2.internal"]
Apr 22 14:15:18.728880 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.728834 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:18.729771 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.729754 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-161.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:18.729922 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.729788 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-161.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:18.729922 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.729801 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-161.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:18.731021 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.731005 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:18.731191 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.731177 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal"
Apr 22 14:15:18.731235 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.731226 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:18.731728 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.731712 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-161.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:18.731820 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.731735 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-161.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:18.731820 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.731744 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-161.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:18.731820 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.731750 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-161.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:18.731820 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.731781 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-161.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:18.731820 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.731791 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-161.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:18.732806 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.732788 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-161.ec2.internal"
Apr 22 14:15:18.732883 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.732815 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:18.733405 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.733388 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-161.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:18.733490 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.733415 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-161.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:18.733490 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.733425 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-161.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:18.756985 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:18.756966 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-161.ec2.internal\" not found" node="ip-10-0-129-161.ec2.internal"
Apr 22 14:15:18.761311 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:18.761296 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-161.ec2.internal\" not found" node="ip-10-0-129-161.ec2.internal"
Apr 22 14:15:18.783346 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:18.783328 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-161.ec2.internal\" not found"
Apr 22 14:15:18.793742 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.793722 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/725459ae6e0afb73747d2931979035a7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal\" (UID: \"725459ae6e0afb73747d2931979035a7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal"
Apr 22 14:15:18.793855 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.793749 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2a7add03423232b9ddbd2ad4e8a3d9c3-config\") pod \"kube-apiserver-proxy-ip-10-0-129-161.ec2.internal\" (UID: \"2a7add03423232b9ddbd2ad4e8a3d9c3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-161.ec2.internal"
Apr 22 14:15:18.793855 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.793771 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/725459ae6e0afb73747d2931979035a7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal\" (UID: \"725459ae6e0afb73747d2931979035a7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal"
Apr 22 14:15:18.883842 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:18.883745 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-161.ec2.internal\" not found"
Apr 22 14:15:18.894282 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.894258 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/725459ae6e0afb73747d2931979035a7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal\" (UID: \"725459ae6e0afb73747d2931979035a7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal"
Apr 22 14:15:18.894339 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.894290 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/725459ae6e0afb73747d2931979035a7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal\" (UID: \"725459ae6e0afb73747d2931979035a7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal"
Apr 22 14:15:18.894339 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.894309 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2a7add03423232b9ddbd2ad4e8a3d9c3-config\") pod \"kube-apiserver-proxy-ip-10-0-129-161.ec2.internal\" (UID: \"2a7add03423232b9ddbd2ad4e8a3d9c3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-161.ec2.internal"
Apr 22 14:15:18.894404 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.894357 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2a7add03423232b9ddbd2ad4e8a3d9c3-config\") pod \"kube-apiserver-proxy-ip-10-0-129-161.ec2.internal\" (UID: \"2a7add03423232b9ddbd2ad4e8a3d9c3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-161.ec2.internal"
Apr 22 14:15:18.894404 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.894373 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/725459ae6e0afb73747d2931979035a7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal\" (UID: \"725459ae6e0afb73747d2931979035a7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal"
Apr 22 14:15:18.894484 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:18.894410 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/725459ae6e0afb73747d2931979035a7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal\" (UID: \"725459ae6e0afb73747d2931979035a7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal"
Apr 22 14:15:18.984538 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:18.984496 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-161.ec2.internal\" not found"
Apr 22 14:15:19.059123 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:19.059092 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal"
Apr 22 14:15:19.062767 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:19.062747 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-161.ec2.internal"
Apr 22 14:15:19.085411 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:19.085376 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-161.ec2.internal\" not found"
Apr 22 14:15:19.186070 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:19.185967 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-161.ec2.internal\" not found"
Apr 22 14:15:19.286633 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:19.286591 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-161.ec2.internal\" not found"
Apr 22 14:15:19.361212 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:19.361177 2566 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 14:15:19.361816 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:19.361334 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 14:15:19.361816 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:19.361357 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 14:15:19.387358 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:19.387322 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-161.ec2.internal\" not found"
Apr 22 14:15:19.484728 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:19.484693 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 14:10:18 +0000 UTC" deadline="2027-11-22 16:05:32.554639551 +0000 UTC"
Apr 22 14:15:19.484728 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:19.484721 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13897h50m13.069920253s"
Apr 22 14:15:19.487704 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:19.487679 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-161.ec2.internal\" not found"
Apr 22 14:15:19.489862 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:19.489840 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 14:15:19.506283 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:19.506259 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 14:15:19.532039 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:19.532005 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wgt8v"
Apr 22 14:15:19.542343 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:19.542320 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wgt8v"
Apr 22 14:15:19.543953 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:19.543936 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:19.575978 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:19.575945 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a7add03423232b9ddbd2ad4e8a3d9c3.slice/crio-b7e22afbf86d1682a60be714af803ddbd36401549ba11fff31221cc65ab4b9a4 WatchSource:0}: Error finding container b7e22afbf86d1682a60be714af803ddbd36401549ba11fff31221cc65ab4b9a4: Status 404 returned error can't find the container with id b7e22afbf86d1682a60be714af803ddbd36401549ba11fff31221cc65ab4b9a4
Apr 22 14:15:19.576168 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:19.576142 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod725459ae6e0afb73747d2931979035a7.slice/crio-9445617f3c41d684aef9bee7a798b48e294519eca88d2e02089fd722e608cfb5 WatchSource:0}: Error finding container 9445617f3c41d684aef9bee7a798b48e294519eca88d2e02089fd722e608cfb5: Status 404 returned error can't find the container with id 9445617f3c41d684aef9bee7a798b48e294519eca88d2e02089fd722e608cfb5
Apr 22 14:15:19.579848 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:19.579832 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 14:15:19.588575 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:19.588546 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-161.ec2.internal\" not found"
Apr 22 14:15:19.632120 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:19.632064 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-161.ec2.internal" event={"ID":"2a7add03423232b9ddbd2ad4e8a3d9c3","Type":"ContainerStarted","Data":"b7e22afbf86d1682a60be714af803ddbd36401549ba11fff31221cc65ab4b9a4"}
Apr 22 14:15:19.633054 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:19.633032 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal" event={"ID":"725459ae6e0afb73747d2931979035a7","Type":"ContainerStarted","Data":"9445617f3c41d684aef9bee7a798b48e294519eca88d2e02089fd722e608cfb5"}
Apr 22 14:15:19.689242 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:19.689210 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-161.ec2.internal\" not found"
Apr 22 14:15:19.790020 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:19.789951 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-161.ec2.internal\" not found"
Apr 22 14:15:19.791650 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:19.791633 2566 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:19.891126 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:19.891081 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-161.ec2.internal"
Apr 22 14:15:19.903037 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:19.903014 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 14:15:19.904032 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:19.903855 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal"
Apr 22 14:15:19.916204 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:19.916184 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 14:15:20.324988 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.324948 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:20.449953 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.449921 2566 apiserver.go:52] "Watching apiserver"
Apr 22 14:15:20.461002 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.460973 2566 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 14:15:20.461451 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.461416 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-4l7kw","openshift-dns/node-resolver-6rsgf","openshift-image-registry/node-ca-zmgl9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal","openshift-multus/multus-additional-cni-plugins-ng46h","openshift-multus/multus-nwk64","openshift-multus/network-metrics-daemon-9rgrl","kube-system/kube-apiserver-proxy-ip-10-0-129-161.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn","openshift-network-diagnostics/network-check-target-w9djl","openshift-network-operator/iptables-alerter-l8hc9","openshift-ovn-kubernetes/ovnkube-node-47psb","kube-system/konnectivity-agent-rnwmp"]
Apr 22 14:15:20.463570 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.463551 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w9djl"
Apr 22 14:15:20.463683 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:20.463621 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w9djl" podUID="ab2a1f01-aab3-488d-8a5c-09e7a9568954"
Apr 22 14:15:20.464789 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.464761 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6rsgf"
Apr 22 14:15:20.465811 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.465793 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ng46h"
Apr 22 14:15:20.467003 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.466881 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.467247 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.467206 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 14:15:20.467247 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.467217 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 14:15:20.467426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.467295 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-972hd\""
Apr 22 14:15:20.468085 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.468066 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 14:15:20.468258 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.468239 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9rgrl"
Apr 22 14:15:20.469133 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:20.469074 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9rgrl" podUID="ff0fda3b-a631-4479-bca1-451b3fd7ac2f"
Apr 22 14:15:20.469221 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.469138 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 14:15:20.469458 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.469427 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 14:15:20.470600 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.469655 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 14:15:20.470600 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.470196 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 14:15:20.470600 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.470288 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qnq4b\""
Apr 22 14:15:20.470600 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.470397 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 14:15:20.470826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.470621 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-4mvbk\""
Apr 22 14:15:20.472666 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.472356 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn"
Apr 22 14:15:20.472666 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.472478 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4l7kw"
Apr 22 14:15:20.473648 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.473615 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-l8hc9"
Apr 22 14:15:20.474631 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.474611 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 14:15:20.474725 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.474654 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 14:15:20.475006 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.474988 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 14:15:20.475006 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.474989 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-74vdz\""
Apr 22 14:15:20.475140 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.475020 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 14:15:20.475140 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.475052 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.475140 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.475075 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-8tqhs\""
Apr 22 14:15:20.475715 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.475700 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 14:15:20.476405 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.476114 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rnwmp"
Apr 22 14:15:20.476405 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.476262 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 14:15:20.476405 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.476282 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 14:15:20.476405 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.476401 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gn89g\""
Apr 22 14:15:20.476701 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.476676 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 14:15:20.477522 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.477450 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zmgl9"
Apr 22 14:15:20.477522 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.477515 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 14:15:20.477929 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.477908 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 14:15:20.478015 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.477908 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dsq29\""
Apr 22 14:15:20.478015 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.477913 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 14:15:20.478015 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.477913 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 14:15:20.478662 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.478605 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 14:15:20.478766 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.478748 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 14:15:20.478766 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.478758 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 14:15:20.478895 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.478801 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 14:15:20.478895 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.478874 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8rdvj\""
Apr 22 14:15:20.479557 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.479539 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 14:15:20.479903 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.479886 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 14:15:20.480216 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.480193 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 14:15:20.480303 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.480254 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-fkrmd\""
Apr 22 14:15:20.493276 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.493260 2566 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 14:15:20.503005 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.502978 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgsgv\" (UniqueName: \"kubernetes.io/projected/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-kube-api-access-mgsgv\") pod \"network-metrics-daemon-9rgrl\" (UID: \"ff0fda3b-a631-4479-bca1-451b3fd7ac2f\") " pod="openshift-multus/network-metrics-daemon-9rgrl"
Apr 22 14:15:20.503096 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503065 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn"
Apr 22 14:15:20.503142 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503099 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-host\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw"
Apr 22 14:15:20.503142 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503123 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6a11fb65-b996-42de-a115-49420effa19b-agent-certs\") pod \"konnectivity-agent-rnwmp\" (UID: \"6a11fb65-b996-42de-a115-49420effa19b\") " pod="kube-system/konnectivity-agent-rnwmp"
Apr 22 14:15:20.503231 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503147 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1e1454ff-e291-42e9-8bb6-cd922139fd02-cni-binary-copy\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.503231 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503173 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-host-run-netns\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.503231 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503216 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-run-systemd\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.503346 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503263 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d37f6164-ab7b-4939-a74e-19ab726827bb-env-overrides\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.503346 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503332 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d37f6164-ab7b-4939-a74e-19ab726827bb-ovnkube-script-lib\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.503459 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503391 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-device-dir\") pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn"
Apr 22 14:15:20.503518 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503469 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvk6v\" (UniqueName: \"kubernetes.io/projected/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-kube-api-access-jvk6v\") pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn"
Apr 22 14:15:20.503518 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503499 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bccc745a-d0a3-4d47-bb03-7502b82f4a26-host-slash\") pod \"iptables-alerter-l8hc9\" (UID: \"bccc745a-d0a3-4d47-bb03-7502b82f4a26\") " pod="openshift-network-operator/iptables-alerter-l8hc9"
Apr 22 14:15:20.503618 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503524 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/57899f5d-95c9-4f88-8a37-538507647859-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h"
Apr 22 14:15:20.503618 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503580 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-system-cni-dir\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.503703 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503623 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-sys-fs\") pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn"
Apr 22 14:15:20.503703 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503646 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-run\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw"
Apr 22 14:15:20.503703 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503668 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-cni-bin\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.503703 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503690 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d37f6164-ab7b-4939-a74e-19ab726827bb-ovnkube-config\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.503878 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503715 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/033bc69f-f51f-40ca-8484-1ae2dc580b53-tmp-dir\") pod \"node-resolver-6rsgf\" (UID: \"033bc69f-f51f-40ca-8484-1ae2dc580b53\") " pod="openshift-dns/node-resolver-6rsgf"
Apr 22 14:15:20.503878 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503740 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfftn\" (UniqueName: \"kubernetes.io/projected/033bc69f-f51f-40ca-8484-1ae2dc580b53-kube-api-access-lfftn\") pod \"node-resolver-6rsgf\" (UID: \"033bc69f-f51f-40ca-8484-1ae2dc580b53\") " pod="openshift-dns/node-resolver-6rsgf"
Apr 22 14:15:20.503878 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503790 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/57899f5d-95c9-4f88-8a37-538507647859-system-cni-dir\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h"
Apr 22 14:15:20.503878 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503840 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-modprobe-d\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw"
Apr 22 14:15:20.503878 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503867 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-kubelet\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.504049 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503893 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-run-ovn-kubernetes\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.504049 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503919 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4nfk\" (UniqueName: \"kubernetes.io/projected/d37f6164-ab7b-4939-a74e-19ab726827bb-kube-api-access-j4nfk\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.504049 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503945 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44867\" (UniqueName: \"kubernetes.io/projected/57899f5d-95c9-4f88-8a37-538507647859-kube-api-access-44867\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h"
Apr 22 14:15:20.504049 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.503968 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-multus-conf-dir\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.504049 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504000 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-kubernetes\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw"
Apr 22 14:15:20.504049 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504022 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-node-log\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.504049 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504044 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l99cq\" (UniqueName: \"kubernetes.io/projected/1c429a6e-0682-4fe6-9ec0-b39e350ccc63-kube-api-access-l99cq\") pod \"node-ca-zmgl9\" (UID: \"1c429a6e-0682-4fe6-9ec0-b39e350ccc63\") " pod="openshift-image-registry/node-ca-zmgl9"
Apr 22 14:15:20.504356 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504067 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1e1454ff-e291-42e9-8bb6-cd922139fd02-multus-daemon-config\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.504356 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504095 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-systemd\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw"
Apr 22 14:15:20.504356 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504131 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-run-ovn\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.504356 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504162 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6a11fb65-b996-42de-a115-49420effa19b-konnectivity-ca\") pod \"konnectivity-agent-rnwmp\" (UID: \"6a11fb65-b996-42de-a115-49420effa19b\") " pod="kube-system/konnectivity-agent-rnwmp"
Apr 22 14:15:20.504356 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504190 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-sysctl-conf\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw"
Apr 22 14:15:20.504356 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504213 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-sys\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw"
Apr 22 14:15:20.504356 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504242 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-log-socket\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.504356 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504270 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/57899f5d-95c9-4f88-8a37-538507647859-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h"
Apr 22 14:15:20.504356 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504321 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bccc745a-d0a3-4d47-bb03-7502b82f4a26-iptables-alerter-script\") pod \"iptables-alerter-l8hc9\" (UID: \"bccc745a-d0a3-4d47-bb03-7502b82f4a26\") " pod="openshift-network-operator/iptables-alerter-l8hc9"
Apr 22 14:15:20.504716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504368 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-hostroot\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.504716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504397 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-etc-selinux\") pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn"
Apr 22 14:15:20.504716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504422 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-os-release\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.504716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504463 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-host-var-lib-cni-bin\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.504716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504487 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-host-run-multus-certs\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.504716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504510 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-sysctl-d\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw"
Apr 22 14:15:20.504716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504534 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-tuned\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw"
Apr 22 14:15:20.504716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504585 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-registration-dir\") pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn"
Apr 22 14:15:20.504716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504635 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs\") pod \"network-metrics-daemon-9rgrl\" (UID: \"ff0fda3b-a631-4479-bca1-451b3fd7ac2f\") " pod="openshift-multus/network-metrics-daemon-9rgrl"
Apr 22 14:15:20.504716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504685 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/57899f5d-95c9-4f88-8a37-538507647859-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h"
Apr 22 14:15:20.504716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504711 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-multus-cni-dir\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.505174 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504740 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-etc-openvswitch\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.505174 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504798 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c429a6e-0682-4fe6-9ec0-b39e350ccc63-host\") pod \"node-ca-zmgl9\" (UID: \"1c429a6e-0682-4fe6-9ec0-b39e350ccc63\") " pod="openshift-image-registry/node-ca-zmgl9"
Apr 22 14:15:20.505174 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504827 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1c429a6e-0682-4fe6-9ec0-b39e350ccc63-serviceca\") pod \"node-ca-zmgl9\" (UID: \"1c429a6e-0682-4fe6-9ec0-b39e350ccc63\") " pod="openshift-image-registry/node-ca-zmgl9"
Apr 22 14:15:20.505174 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504852 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twxx2\" (UniqueName: \"kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2\") pod \"network-check-target-w9djl\" (UID: \"ab2a1f01-aab3-488d-8a5c-09e7a9568954\") " pod="openshift-network-diagnostics/network-check-target-w9djl"
Apr 22 14:15:20.505174 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504875 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx8s7\" (UniqueName: \"kubernetes.io/projected/bccc745a-d0a3-4d47-bb03-7502b82f4a26-kube-api-access-jx8s7\") pod \"iptables-alerter-l8hc9\" (UID: \"bccc745a-d0a3-4d47-bb03-7502b82f4a26\") " pod="openshift-network-operator/iptables-alerter-l8hc9"
Apr 22 14:15:20.505174 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504927 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-host-run-k8s-cni-cncf-io\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.505174 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504953 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-host-var-lib-kubelet\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.505174 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504976 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-var-lib-kubelet\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw"
Apr 22 14:15:20.505174 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.504998 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-lib-modules\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw"
Apr 22 14:15:20.505174 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505023 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.505174 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505047 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d37f6164-ab7b-4939-a74e-19ab726827bb-ovn-node-metrics-cert\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.505174 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505088 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/033bc69f-f51f-40ca-8484-1ae2dc580b53-hosts-file\") pod \"node-resolver-6rsgf\" (UID: \"033bc69f-f51f-40ca-8484-1ae2dc580b53\") " pod="openshift-dns/node-resolver-6rsgf"
Apr 22 14:15:20.505174 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505115 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-socket-dir\") pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn"
Apr 22 14:15:20.505174 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505130 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/57899f5d-95c9-4f88-8a37-538507647859-cnibin\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h"
Apr 22 14:15:20.505174 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505145 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-etc-kubernetes\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.505174 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505168 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z68x8\" (UniqueName: \"kubernetes.io/projected/c9e3f13d-f48a-4ce7-a59d-16c11e660545-kube-api-access-z68x8\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw"
Apr 22 14:15:20.505822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505188 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-systemd-units\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.505822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505222 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-run-netns\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.505822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505250 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-run-openvswitch\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.505822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505276 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-cnibin\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.505822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505305 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-sysconfig\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw"
Apr 22 14:15:20.505822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505326 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-slash\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.505822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505351 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-cni-netd\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.505822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505374 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-multus-socket-dir-parent\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.505822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505398 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-host-var-lib-cni-multus\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.505822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505425 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49mjr\" (UniqueName:
\"kubernetes.io/projected/1e1454ff-e291-42e9-8bb6-cd922139fd02-kube-api-access-49mjr\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.505822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505482 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c9e3f13d-f48a-4ce7-a59d-16c11e660545-tmp\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.505822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505510 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-var-lib-openvswitch\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.505822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505527 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/57899f5d-95c9-4f88-8a37-538507647859-os-release\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h" Apr 22 14:15:20.505822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.505548 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/57899f5d-95c9-4f88-8a37-538507647859-cni-binary-copy\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h" Apr 22 14:15:20.543211 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.543178 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:10:19 +0000 UTC" deadline="2027-11-01 20:44:51.226686712 +0000 UTC" Apr 22 14:15:20.543211 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.543208 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13398h29m30.683481505s" Apr 22 14:15:20.606121 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606092 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-cni-bin\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.606290 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606134 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d37f6164-ab7b-4939-a74e-19ab726827bb-ovnkube-config\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.606290 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606163 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/033bc69f-f51f-40ca-8484-1ae2dc580b53-tmp-dir\") pod \"node-resolver-6rsgf\" (UID: 
\"033bc69f-f51f-40ca-8484-1ae2dc580b53\") " pod="openshift-dns/node-resolver-6rsgf" Apr 22 14:15:20.606290 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606186 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfftn\" (UniqueName: \"kubernetes.io/projected/033bc69f-f51f-40ca-8484-1ae2dc580b53-kube-api-access-lfftn\") pod \"node-resolver-6rsgf\" (UID: \"033bc69f-f51f-40ca-8484-1ae2dc580b53\") " pod="openshift-dns/node-resolver-6rsgf" Apr 22 14:15:20.606290 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606185 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-cni-bin\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.606290 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606210 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/57899f5d-95c9-4f88-8a37-538507647859-system-cni-dir\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h" Apr 22 14:15:20.606290 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606233 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-modprobe-d\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.606290 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606255 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-kubelet\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.606290 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606273 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-run-ovn-kubernetes\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.606290 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606288 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4nfk\" (UniqueName: \"kubernetes.io/projected/d37f6164-ab7b-4939-a74e-19ab726827bb-kube-api-access-j4nfk\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.606716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606313 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44867\" (UniqueName: \"kubernetes.io/projected/57899f5d-95c9-4f88-8a37-538507647859-kube-api-access-44867\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h" Apr 22 14:15:20.606716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606337 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-multus-conf-dir\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.606716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606363 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-kubernetes\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.606716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606359 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/57899f5d-95c9-4f88-8a37-538507647859-system-cni-dir\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h" Apr 22 14:15:20.606716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606453 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-modprobe-d\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.606716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606516 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-kubelet\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.606716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606558 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-kubernetes\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.606716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606521 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-multus-conf-dir\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.606716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606577 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/033bc69f-f51f-40ca-8484-1ae2dc580b53-tmp-dir\") pod \"node-resolver-6rsgf\" (UID: \"033bc69f-f51f-40ca-8484-1ae2dc580b53\") " pod="openshift-dns/node-resolver-6rsgf" Apr 22 14:15:20.606716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606568 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-run-ovn-kubernetes\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.606716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606691 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-node-log\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.607200 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606737 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-node-log\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.607200 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606738 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l99cq\" (UniqueName: \"kubernetes.io/projected/1c429a6e-0682-4fe6-9ec0-b39e350ccc63-kube-api-access-l99cq\") pod \"node-ca-zmgl9\" (UID: \"1c429a6e-0682-4fe6-9ec0-b39e350ccc63\") " pod="openshift-image-registry/node-ca-zmgl9" Apr 22 14:15:20.607200 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606785 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1e1454ff-e291-42e9-8bb6-cd922139fd02-multus-daemon-config\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.607200 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606818 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-systemd\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.607200 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606843 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-run-ovn\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.607200 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606859 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d37f6164-ab7b-4939-a74e-19ab726827bb-ovnkube-config\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.607200 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606866 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6a11fb65-b996-42de-a115-49420effa19b-konnectivity-ca\") pod \"konnectivity-agent-rnwmp\" (UID: \"6a11fb65-b996-42de-a115-49420effa19b\") " pod="kube-system/konnectivity-agent-rnwmp" Apr 22 14:15:20.607200 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606891 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-sysctl-conf\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.607200 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606908 2566 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-systemd\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.607200 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606913 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-run-ovn\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.607200 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606982 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-sys\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.607200 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.606926 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-sys\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.607200 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607032 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-log-socket\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.607200 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607051 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-sysctl-conf\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.607200 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607064 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/57899f5d-95c9-4f88-8a37-538507647859-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h" Apr 22 14:15:20.607200 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607095 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bccc745a-d0a3-4d47-bb03-7502b82f4a26-iptables-alerter-script\") pod \"iptables-alerter-l8hc9\" (UID: \"bccc745a-d0a3-4d47-bb03-7502b82f4a26\") " pod="openshift-network-operator/iptables-alerter-l8hc9" Apr 22 14:15:20.607200 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607121 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-hostroot\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.607200 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607145 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-etc-selinux\") pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn" Apr 22 14:15:20.608009 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607169 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-os-release\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.608009 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607195 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-hostroot\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.608009 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607097 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-log-socket\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.608009 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607234 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-os-release\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.608009 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607234 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-host-var-lib-cni-bin\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.608009 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607265 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-host-var-lib-cni-bin\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.608009 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607266 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-host-run-multus-certs\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.608009 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607292 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-etc-selinux\") pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn" Apr 22 14:15:20.608009 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607296 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-host-run-multus-certs\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.608009 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607299 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-sysctl-d\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.608009 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607347 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-tuned\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.608009 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607375 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-registration-dir\") pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn" Apr 22 14:15:20.608009 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607395 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-sysctl-d\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.608009 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607403 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs\") pod \"network-metrics-daemon-9rgrl\" (UID: \"ff0fda3b-a631-4479-bca1-451b3fd7ac2f\") " pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:20.608009 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607423 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1e1454ff-e291-42e9-8bb6-cd922139fd02-multus-daemon-config\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.608009 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607464 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-registration-dir\") pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn" Apr 22 14:15:20.608009 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607462 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6a11fb65-b996-42de-a115-49420effa19b-konnectivity-ca\") pod \"konnectivity-agent-rnwmp\" (UID: \"6a11fb65-b996-42de-a115-49420effa19b\") " pod="kube-system/konnectivity-agent-rnwmp" Apr 22 14:15:20.608009 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607492 
2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/57899f5d-95c9-4f88-8a37-538507647859-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h"
Apr 22 14:15:20.608822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607531 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-multus-cni-dir\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.608822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607559 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-etc-openvswitch\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.608822 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:20.607562 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:20.608822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607584 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c429a6e-0682-4fe6-9ec0-b39e350ccc63-host\") pod \"node-ca-zmgl9\" (UID: \"1c429a6e-0682-4fe6-9ec0-b39e350ccc63\") " pod="openshift-image-registry/node-ca-zmgl9"
Apr 22 14:15:20.608822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607618 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/57899f5d-95c9-4f88-8a37-538507647859-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h"
Apr 22 14:15:20.608822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607641 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-multus-cni-dir\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.608822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607607 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1c429a6e-0682-4fe6-9ec0-b39e350ccc63-serviceca\") pod \"node-ca-zmgl9\" (UID: \"1c429a6e-0682-4fe6-9ec0-b39e350ccc63\") " pod="openshift-image-registry/node-ca-zmgl9"
Apr 22 14:15:20.608822 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:20.607674 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs podName:ff0fda3b-a631-4479-bca1-451b3fd7ac2f nodeName:}" failed. No retries permitted until 2026-04-22 14:15:21.107619956 +0000 UTC m=+3.087131333 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs") pod "network-metrics-daemon-9rgrl" (UID: "ff0fda3b-a631-4479-bca1-451b3fd7ac2f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:20.608822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607676 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-etc-openvswitch\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.608822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607688 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c429a6e-0682-4fe6-9ec0-b39e350ccc63-host\") pod \"node-ca-zmgl9\" (UID: \"1c429a6e-0682-4fe6-9ec0-b39e350ccc63\") " pod="openshift-image-registry/node-ca-zmgl9"
Apr 22 14:15:20.608822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607722 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twxx2\" (UniqueName: \"kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2\") pod \"network-check-target-w9djl\" (UID: \"ab2a1f01-aab3-488d-8a5c-09e7a9568954\") " pod="openshift-network-diagnostics/network-check-target-w9djl"
Apr 22 14:15:20.608822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607747 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jx8s7\" (UniqueName: \"kubernetes.io/projected/bccc745a-d0a3-4d47-bb03-7502b82f4a26-kube-api-access-jx8s7\") pod \"iptables-alerter-l8hc9\" (UID: \"bccc745a-d0a3-4d47-bb03-7502b82f4a26\") " pod="openshift-network-operator/iptables-alerter-l8hc9"
Apr 22 14:15:20.608822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607774 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-host-run-k8s-cni-cncf-io\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.608822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607825 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-host-run-k8s-cni-cncf-io\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.608822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607854 2566 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 14:15:20.608822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607889 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bccc745a-d0a3-4d47-bb03-7502b82f4a26-iptables-alerter-script\") pod \"iptables-alerter-l8hc9\" (UID: \"bccc745a-d0a3-4d47-bb03-7502b82f4a26\") " pod="openshift-network-operator/iptables-alerter-l8hc9"
Apr 22 14:15:20.608822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607934 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-host-var-lib-kubelet\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.609614 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.607970 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-var-lib-kubelet\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw"
Apr 22 14:15:20.609614 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608001 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-lib-modules\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw"
Apr 22 14:15:20.609614 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608038 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-host-var-lib-kubelet\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64"
Apr 22 14:15:20.609614 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608041 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1c429a6e-0682-4fe6-9ec0-b39e350ccc63-serviceca\") pod \"node-ca-zmgl9\" (UID: \"1c429a6e-0682-4fe6-9ec0-b39e350ccc63\") " pod="openshift-image-registry/node-ca-zmgl9"
Apr 22 14:15:20.609614 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608053 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-var-lib-kubelet\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw"
Apr 22 14:15:20.609614 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608074 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb"
Apr 22 14:15:20.609614 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608114 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/d37f6164-ab7b-4939-a74e-19ab726827bb-ovn-node-metrics-cert\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.609614 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608122 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-lib-modules\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.609614 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608129 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/033bc69f-f51f-40ca-8484-1ae2dc580b53-hosts-file\") pod \"node-resolver-6rsgf\" (UID: \"033bc69f-f51f-40ca-8484-1ae2dc580b53\") " pod="openshift-dns/node-resolver-6rsgf" Apr 22 14:15:20.609614 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608166 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-socket-dir\") pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn" Apr 22 14:15:20.609614 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608184 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.609614 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608193 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/57899f5d-95c9-4f88-8a37-538507647859-cnibin\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h" Apr 22 14:15:20.609614 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608222 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/033bc69f-f51f-40ca-8484-1ae2dc580b53-hosts-file\") pod \"node-resolver-6rsgf\" (UID: \"033bc69f-f51f-40ca-8484-1ae2dc580b53\") " pod="openshift-dns/node-resolver-6rsgf" Apr 22 14:15:20.609614 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608236 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-etc-kubernetes\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.609614 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608248 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/57899f5d-95c9-4f88-8a37-538507647859-cnibin\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h" Apr 22 14:15:20.609614 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608263 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z68x8\" (UniqueName: \"kubernetes.io/projected/c9e3f13d-f48a-4ce7-a59d-16c11e660545-kube-api-access-z68x8\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.609614 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608287 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-systemd-units\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.610286 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608290 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-etc-kubernetes\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.610286 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608310 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-run-netns\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.610286 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608322 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-socket-dir\") pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn" Apr 22 14:15:20.610286 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608334 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-run-openvswitch\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.610286 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608361 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-cnibin\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.610286 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608367 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-systemd-units\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.610286 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608384 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-sysconfig\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.610286 ip-10-0-129-161 kubenswrapper[2566]: I0422 
14:15:20.608385 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/57899f5d-95c9-4f88-8a37-538507647859-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h" Apr 22 14:15:20.610286 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608403 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-run-openvswitch\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.610286 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608408 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-slash\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.610286 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608458 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-cni-netd\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.610286 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608467 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-run-netns\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.610286 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608475 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-cnibin\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.610286 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608485 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-multus-socket-dir-parent\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.610286 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608509 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-slash\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.610286 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608520 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-host-cni-netd\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.610286 ip-10-0-129-161 
kubenswrapper[2566]: I0422 14:15:20.608542 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-host-var-lib-cni-multus\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.610286 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608547 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-sysconfig\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.611076 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608543 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-multus-socket-dir-parent\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.611076 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608571 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49mjr\" (UniqueName: \"kubernetes.io/projected/1e1454ff-e291-42e9-8bb6-cd922139fd02-kube-api-access-49mjr\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.611076 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608592 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-host-var-lib-cni-multus\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.611076 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608594 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c9e3f13d-f48a-4ce7-a59d-16c11e660545-tmp\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.611076 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608637 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-var-lib-openvswitch\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.611076 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608664 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/57899f5d-95c9-4f88-8a37-538507647859-os-release\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h" Apr 22 14:15:20.611076 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608677 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-var-lib-openvswitch\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.611076 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608689 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/57899f5d-95c9-4f88-8a37-538507647859-cni-binary-copy\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h" Apr 22 14:15:20.611076 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608716 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgsgv\" (UniqueName: \"kubernetes.io/projected/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-kube-api-access-mgsgv\") pod \"network-metrics-daemon-9rgrl\" (UID: \"ff0fda3b-a631-4479-bca1-451b3fd7ac2f\") " pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:20.611076 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608741 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/57899f5d-95c9-4f88-8a37-538507647859-os-release\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h" Apr 22 14:15:20.611076 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608822 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn" Apr 22 14:15:20.611076 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608855 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-host\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.611076 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608877 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6a11fb65-b996-42de-a115-49420effa19b-agent-certs\") pod \"konnectivity-agent-rnwmp\" (UID: \"6a11fb65-b996-42de-a115-49420effa19b\") " pod="kube-system/konnectivity-agent-rnwmp" Apr 22 14:15:20.611076 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608901 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1e1454ff-e291-42e9-8bb6-cd922139fd02-cni-binary-copy\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.611076 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608927 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-host-run-netns\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.611076 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608942 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-kubelet-dir\") 
pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn" Apr 22 14:15:20.611076 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608967 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-run-systemd\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.611729 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608987 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-host\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.611729 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.608996 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d37f6164-ab7b-4939-a74e-19ab726827bb-env-overrides\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.611729 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.609032 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d37f6164-ab7b-4939-a74e-19ab726827bb-ovnkube-script-lib\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.611729 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.609061 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-device-dir\") pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn" Apr 22 14:15:20.611729 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.609086 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvk6v\" (UniqueName: \"kubernetes.io/projected/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-kube-api-access-jvk6v\") pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn" Apr 22 14:15:20.611729 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.609129 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bccc745a-d0a3-4d47-bb03-7502b82f4a26-host-slash\") pod \"iptables-alerter-l8hc9\" (UID: \"bccc745a-d0a3-4d47-bb03-7502b82f4a26\") " pod="openshift-network-operator/iptables-alerter-l8hc9" Apr 22 14:15:20.611729 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.609156 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/57899f5d-95c9-4f88-8a37-538507647859-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h" Apr 22 14:15:20.611729 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.609181 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-system-cni-dir\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.611729 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.609189 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/57899f5d-95c9-4f88-8a37-538507647859-cni-binary-copy\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h" Apr 22 14:15:20.611729 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.609212 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-sys-fs\") pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn" Apr 22 14:15:20.611729 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.609270 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-run\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.611729 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.609359 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c9e3f13d-f48a-4ce7-a59d-16c11e660545-run\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.611729 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.609427 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bccc745a-d0a3-4d47-bb03-7502b82f4a26-host-slash\") pod \"iptables-alerter-l8hc9\" (UID: \"bccc745a-d0a3-4d47-bb03-7502b82f4a26\") " pod="openshift-network-operator/iptables-alerter-l8hc9" Apr 22 14:15:20.611729 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.609465 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d37f6164-ab7b-4939-a74e-19ab726827bb-env-overrides\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.611729 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.609476 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1e1454ff-e291-42e9-8bb6-cd922139fd02-cni-binary-copy\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.611729 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.609530 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-system-cni-dir\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.611729 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.609552 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-sys-fs\") pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn" Apr 22 14:15:20.611729 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.609575 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e1454ff-e291-42e9-8bb6-cd922139fd02-host-run-netns\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.612377 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.609470 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-device-dir\") pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn" Apr 22 14:15:20.612377 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.609600 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d37f6164-ab7b-4939-a74e-19ab726827bb-run-systemd\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.612377 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.609610 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/57899f5d-95c9-4f88-8a37-538507647859-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h" Apr 22 14:15:20.612377 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.609911 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d37f6164-ab7b-4939-a74e-19ab726827bb-ovnkube-script-lib\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.612377 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.611554 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d37f6164-ab7b-4939-a74e-19ab726827bb-ovn-node-metrics-cert\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.612377 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.611564 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c9e3f13d-f48a-4ce7-a59d-16c11e660545-tmp\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.612377 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.611554 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c9e3f13d-f48a-4ce7-a59d-16c11e660545-etc-tuned\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.612377 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.611760 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6a11fb65-b996-42de-a115-49420effa19b-agent-certs\") pod \"konnectivity-agent-rnwmp\" (UID: \"6a11fb65-b996-42de-a115-49420effa19b\") " pod="kube-system/konnectivity-agent-rnwmp" Apr 22 14:15:20.615042 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.614712 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfftn\" (UniqueName: \"kubernetes.io/projected/033bc69f-f51f-40ca-8484-1ae2dc580b53-kube-api-access-lfftn\") pod \"node-resolver-6rsgf\" (UID: \"033bc69f-f51f-40ca-8484-1ae2dc580b53\") " pod="openshift-dns/node-resolver-6rsgf" Apr 22 14:15:20.615042 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.615000 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44867\" (UniqueName: \"kubernetes.io/projected/57899f5d-95c9-4f88-8a37-538507647859-kube-api-access-44867\") pod \"multus-additional-cni-plugins-ng46h\" (UID: \"57899f5d-95c9-4f88-8a37-538507647859\") " pod="openshift-multus/multus-additional-cni-plugins-ng46h" Apr 22 14:15:20.615180 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.615162 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l99cq\" (UniqueName: \"kubernetes.io/projected/1c429a6e-0682-4fe6-9ec0-b39e350ccc63-kube-api-access-l99cq\") pod \"node-ca-zmgl9\" (UID: \"1c429a6e-0682-4fe6-9ec0-b39e350ccc63\") " pod="openshift-image-registry/node-ca-zmgl9" Apr 22 14:15:20.615231 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.615167 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4nfk\" (UniqueName: \"kubernetes.io/projected/d37f6164-ab7b-4939-a74e-19ab726827bb-kube-api-access-j4nfk\") pod \"ovnkube-node-47psb\" (UID: \"d37f6164-ab7b-4939-a74e-19ab726827bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.618641 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:20.618622 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:20.618641 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:20.618641 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:20.618805 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:20.618652 2566 projected.go:194] Error preparing data for projected volume kube-api-access-twxx2 for pod openshift-network-diagnostics/network-check-target-w9djl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:20.618805 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:20.618722 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2 podName:ab2a1f01-aab3-488d-8a5c-09e7a9568954 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:21.118703875 +0000 UTC m=+3.098215235 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-twxx2" (UniqueName: "kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2") pod "network-check-target-w9djl" (UID: "ab2a1f01-aab3-488d-8a5c-09e7a9568954") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:20.621985 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.621962 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx8s7\" (UniqueName: \"kubernetes.io/projected/bccc745a-d0a3-4d47-bb03-7502b82f4a26-kube-api-access-jx8s7\") pod \"iptables-alerter-l8hc9\" (UID: \"bccc745a-d0a3-4d47-bb03-7502b82f4a26\") " pod="openshift-network-operator/iptables-alerter-l8hc9" Apr 22 14:15:20.623958 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.623934 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgsgv\" (UniqueName: \"kubernetes.io/projected/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-kube-api-access-mgsgv\") pod \"network-metrics-daemon-9rgrl\" (UID: \"ff0fda3b-a631-4479-bca1-451b3fd7ac2f\") " pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:20.624597 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.624351 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvk6v\" (UniqueName: \"kubernetes.io/projected/549a6cb6-40be-4bb2-9a43-ba2a9d5ea855-kube-api-access-jvk6v\") pod \"aws-ebs-csi-driver-node-jm7xn\" (UID: \"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn" Apr 22 14:15:20.624693 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.624665 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z68x8\" (UniqueName: \"kubernetes.io/projected/c9e3f13d-f48a-4ce7-a59d-16c11e660545-kube-api-access-z68x8\") pod \"tuned-4l7kw\" (UID: \"c9e3f13d-f48a-4ce7-a59d-16c11e660545\") " pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.624737 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.624718 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49mjr\" (UniqueName: \"kubernetes.io/projected/1e1454ff-e291-42e9-8bb6-cd922139fd02-kube-api-access-49mjr\") pod \"multus-nwk64\" (UID: \"1e1454ff-e291-42e9-8bb6-cd922139fd02\") " pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.777992 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.777953 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6rsgf" Apr 22 14:15:20.784677 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.784653 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ng46h" Apr 22 14:15:20.792316 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.792292 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nwk64" Apr 22 14:15:20.799082 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.799062 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn" Apr 22 14:15:20.805161 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.805135 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" Apr 22 14:15:20.812790 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.812774 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-l8hc9" Apr 22 14:15:20.819310 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.819294 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:20.825912 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.825891 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rnwmp" Apr 22 14:15:20.833452 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.833418 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zmgl9" Apr 22 14:15:20.882742 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:20.882681 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:21.112619 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:21.112589 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs\") pod \"network-metrics-daemon-9rgrl\" (UID: \"ff0fda3b-a631-4479-bca1-451b3fd7ac2f\") " pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:21.112786 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:21.112720 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:21.112835 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:21.112791 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs podName:ff0fda3b-a631-4479-bca1-451b3fd7ac2f nodeName:}" failed. No retries permitted until 2026-04-22 14:15:22.112770349 +0000 UTC m=+4.092281723 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs") pod "network-metrics-daemon-9rgrl" (UID: "ff0fda3b-a631-4479-bca1-451b3fd7ac2f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:21.189488 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:21.189459 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9e3f13d_f48a_4ce7_a59d_16c11e660545.slice/crio-f3d210c5b95fe7e6a5f256c4c941130a3a920db4948a93675ccbda514bc60f7e WatchSource:0}: Error finding container f3d210c5b95fe7e6a5f256c4c941130a3a920db4948a93675ccbda514bc60f7e: Status 404 returned error can't find the container with id f3d210c5b95fe7e6a5f256c4c941130a3a920db4948a93675ccbda514bc60f7e Apr 22 14:15:21.190145 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:21.190121 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd37f6164_ab7b_4939_a74e_19ab726827bb.slice/crio-ba584bdb8800f6f07b41cee314ba0008a12068c0e43937984e0015d03b09da4f WatchSource:0}: Error finding container ba584bdb8800f6f07b41cee314ba0008a12068c0e43937984e0015d03b09da4f: Status 404 returned error can't find the container with id ba584bdb8800f6f07b41cee314ba0008a12068c0e43937984e0015d03b09da4f Apr 22 14:15:21.191261 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:21.191153 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbccc745a_d0a3_4d47_bb03_7502b82f4a26.slice/crio-67c4963cd71455e574d2f242b632286372ef9d36f20ba87559198cecfe703399 WatchSource:0}: Error finding container 67c4963cd71455e574d2f242b632286372ef9d36f20ba87559198cecfe703399: Status 404 returned error can't find the container with id 67c4963cd71455e574d2f242b632286372ef9d36f20ba87559198cecfe703399 Apr 22 14:15:21.195465 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:21.195425 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod549a6cb6_40be_4bb2_9a43_ba2a9d5ea855.slice/crio-7df8414170f6caf8c3340b7f8802ddeea0cd938cfb44ce08be5b078e3fb222a1 WatchSource:0}: Error finding container 7df8414170f6caf8c3340b7f8802ddeea0cd938cfb44ce08be5b078e3fb222a1: Status 404 returned error can't find the container with id 7df8414170f6caf8c3340b7f8802ddeea0cd938cfb44ce08be5b078e3fb222a1 Apr 22 14:15:21.199072 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:21.198285 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57899f5d_95c9_4f88_8a37_538507647859.slice/crio-ffe77a847a2411e38e7e3f6250c219a11b882d0079ba1d941153abb34d9e1b84 WatchSource:0}: Error finding container ffe77a847a2411e38e7e3f6250c219a11b882d0079ba1d941153abb34d9e1b84: Status 404 returned error can't find the container with id ffe77a847a2411e38e7e3f6250c219a11b882d0079ba1d941153abb34d9e1b84 Apr 22 14:15:21.200485 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:21.199630 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c429a6e_0682_4fe6_9ec0_b39e350ccc63.slice/crio-bfcd1ca1224cdb80641d46bdfc2c70aa6994c44d611360b7c48e5209764321b7 WatchSource:0}: Error finding container bfcd1ca1224cdb80641d46bdfc2c70aa6994c44d611360b7c48e5209764321b7: Status 404 returned error can't 
find the container with id bfcd1ca1224cdb80641d46bdfc2c70aa6994c44d611360b7c48e5209764321b7 Apr 22 14:15:21.211219 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:21.211046 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e1454ff_e291_42e9_8bb6_cd922139fd02.slice/crio-15ec0dd84b0379b6874b1b5cf18604c32e760fdfb60a3acabd34283f6337140a WatchSource:0}: Error finding container 15ec0dd84b0379b6874b1b5cf18604c32e760fdfb60a3acabd34283f6337140a: Status 404 returned error can't find the container with id 15ec0dd84b0379b6874b1b5cf18604c32e760fdfb60a3acabd34283f6337140a Apr 22 14:15:21.211842 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:21.211822 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a11fb65_b996_42de_a115_49420effa19b.slice/crio-c97ccee059fcf56dd0a8f84ce48d35fba9a8325514747306f9e4f0b007272a5c WatchSource:0}: Error finding container c97ccee059fcf56dd0a8f84ce48d35fba9a8325514747306f9e4f0b007272a5c: Status 404 returned error can't find the container with id c97ccee059fcf56dd0a8f84ce48d35fba9a8325514747306f9e4f0b007272a5c Apr 22 14:15:21.213001 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:21.212980 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twxx2\" (UniqueName: \"kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2\") pod \"network-check-target-w9djl\" (UID: \"ab2a1f01-aab3-488d-8a5c-09e7a9568954\") " pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:21.213152 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:21.213106 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:21.213152 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:21.213124 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:21.213152 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:21.213136 2566 projected.go:194] Error preparing data for projected volume kube-api-access-twxx2 for pod openshift-network-diagnostics/network-check-target-w9djl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:21.213302 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:21.213183 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2 podName:ab2a1f01-aab3-488d-8a5c-09e7a9568954 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:22.21316756 +0000 UTC m=+4.192678932 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-twxx2" (UniqueName: "kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2") pod "network-check-target-w9djl" (UID: "ab2a1f01-aab3-488d-8a5c-09e7a9568954") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:21.543675 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:21.543361 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:10:19 +0000 UTC" deadline="2028-01-23 14:04:58.386517678 +0000 UTC" Apr 22 14:15:21.543675 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:21.543588 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15383h49m36.842933366s" Apr 22 14:15:21.648103 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:21.648064 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-l8hc9" event={"ID":"bccc745a-d0a3-4d47-bb03-7502b82f4a26","Type":"ContainerStarted","Data":"67c4963cd71455e574d2f242b632286372ef9d36f20ba87559198cecfe703399"} Apr 22 14:15:21.650559 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:21.650496 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" event={"ID":"c9e3f13d-f48a-4ce7-a59d-16c11e660545","Type":"ContainerStarted","Data":"f3d210c5b95fe7e6a5f256c4c941130a3a920db4948a93675ccbda514bc60f7e"} Apr 22 14:15:21.662241 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:21.661637 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-161.ec2.internal" event={"ID":"2a7add03423232b9ddbd2ad4e8a3d9c3","Type":"ContainerStarted","Data":"8bcd97716db4168aa0aebfa45ad7a2c36b09f3655fbe5021cdb6b245d3f6272e"} Apr 22 14:15:21.670036 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:21.669978 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rnwmp" event={"ID":"6a11fb65-b996-42de-a115-49420effa19b","Type":"ContainerStarted","Data":"c97ccee059fcf56dd0a8f84ce48d35fba9a8325514747306f9e4f0b007272a5c"} Apr 22 14:15:21.673901 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:21.673850 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nwk64" event={"ID":"1e1454ff-e291-42e9-8bb6-cd922139fd02","Type":"ContainerStarted","Data":"15ec0dd84b0379b6874b1b5cf18604c32e760fdfb60a3acabd34283f6337140a"} Apr 22 14:15:21.675426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:21.675365 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-161.ec2.internal" podStartSLOduration=2.675350496 podStartE2EDuration="2.675350496s" podCreationTimestamp="2026-04-22 14:15:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:21.675254194 +0000 UTC m=+3.654765573" watchObservedRunningTime="2026-04-22 14:15:21.675350496 +0000 UTC m=+3.654861870" Apr 22 14:15:21.692662 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:21.692630 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ng46h" 
event={"ID":"57899f5d-95c9-4f88-8a37-538507647859","Type":"ContainerStarted","Data":"ffe77a847a2411e38e7e3f6250c219a11b882d0079ba1d941153abb34d9e1b84"} Apr 22 14:15:21.712828 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:21.712793 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn" event={"ID":"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855","Type":"ContainerStarted","Data":"7df8414170f6caf8c3340b7f8802ddeea0cd938cfb44ce08be5b078e3fb222a1"} Apr 22 14:15:21.718808 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:21.718780 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-47psb" event={"ID":"d37f6164-ab7b-4939-a74e-19ab726827bb","Type":"ContainerStarted","Data":"ba584bdb8800f6f07b41cee314ba0008a12068c0e43937984e0015d03b09da4f"} Apr 22 14:15:21.731361 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:21.731005 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6rsgf" event={"ID":"033bc69f-f51f-40ca-8484-1ae2dc580b53","Type":"ContainerStarted","Data":"502e1ed69bdc8c018967a7f8d23eefc69dcc97c3c3b28970cec011cbf4dff18b"} Apr 22 14:15:21.735092 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:21.735065 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zmgl9" event={"ID":"1c429a6e-0682-4fe6-9ec0-b39e350ccc63","Type":"ContainerStarted","Data":"bfcd1ca1224cdb80641d46bdfc2c70aa6994c44d611360b7c48e5209764321b7"} Apr 22 14:15:21.947585 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:21.947558 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-dghbg"] Apr 22 14:15:21.951623 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:21.950582 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:21.951623 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:21.950661 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dghbg" podUID="a3986444-d7dd-4409-b107-157fc81b5e02" Apr 22 14:15:22.018311 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:22.018279 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a3986444-d7dd-4409-b107-157fc81b5e02-kubelet-config\") pod \"global-pull-secret-syncer-dghbg\" (UID: \"a3986444-d7dd-4409-b107-157fc81b5e02\") " pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:22.018463 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:22.018319 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret\") pod \"global-pull-secret-syncer-dghbg\" (UID: \"a3986444-d7dd-4409-b107-157fc81b5e02\") " pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:22.018463 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:22.018349 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a3986444-d7dd-4409-b107-157fc81b5e02-dbus\") pod \"global-pull-secret-syncer-dghbg\" (UID: \"a3986444-d7dd-4409-b107-157fc81b5e02\") " pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:22.119709 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:22.118679 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret\") pod \"global-pull-secret-syncer-dghbg\" (UID: \"a3986444-d7dd-4409-b107-157fc81b5e02\") " pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:22.119709 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:22.118746 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a3986444-d7dd-4409-b107-157fc81b5e02-dbus\") pod \"global-pull-secret-syncer-dghbg\" (UID: \"a3986444-d7dd-4409-b107-157fc81b5e02\") " pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:22.119709 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:22.118780 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs\") pod \"network-metrics-daemon-9rgrl\" (UID: \"ff0fda3b-a631-4479-bca1-451b3fd7ac2f\") " pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:22.119709 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:22.118857 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a3986444-d7dd-4409-b107-157fc81b5e02-kubelet-config\") pod \"global-pull-secret-syncer-dghbg\" (UID: \"a3986444-d7dd-4409-b107-157fc81b5e02\") " pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:22.119709 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:22.118947 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a3986444-d7dd-4409-b107-157fc81b5e02-kubelet-config\") pod \"global-pull-secret-syncer-dghbg\" (UID: \"a3986444-d7dd-4409-b107-157fc81b5e02\") " pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:22.119709 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:22.119058 2566 secret.go:189] 
Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:22.119709 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:22.119114 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret podName:a3986444-d7dd-4409-b107-157fc81b5e02 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:22.619096293 +0000 UTC m=+4.598607657 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret") pod "global-pull-secret-syncer-dghbg" (UID: "a3986444-d7dd-4409-b107-157fc81b5e02") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:22.119709 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:22.119612 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:22.119709 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:22.119663 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs podName:ff0fda3b-a631-4479-bca1-451b3fd7ac2f nodeName:}" failed. No retries permitted until 2026-04-22 14:15:24.119647513 +0000 UTC m=+6.099158874 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs") pod "network-metrics-daemon-9rgrl" (UID: "ff0fda3b-a631-4479-bca1-451b3fd7ac2f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:22.120305 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:22.120267 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a3986444-d7dd-4409-b107-157fc81b5e02-dbus\") pod \"global-pull-secret-syncer-dghbg\" (UID: \"a3986444-d7dd-4409-b107-157fc81b5e02\") " pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:22.220073 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:22.219991 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twxx2\" (UniqueName: \"kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2\") pod \"network-check-target-w9djl\" (UID: \"ab2a1f01-aab3-488d-8a5c-09e7a9568954\") " pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:22.220237 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:22.220190 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:22.220237 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:22.220215 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:22.220237 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:22.220229 2566 projected.go:194] Error preparing data for projected volume kube-api-access-twxx2 for pod openshift-network-diagnostics/network-check-target-w9djl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:22.220398 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:22.220292 2566 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2 podName:ab2a1f01-aab3-488d-8a5c-09e7a9568954 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:24.220272283 +0000 UTC m=+6.199783643 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-twxx2" (UniqueName: "kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2") pod "network-check-target-w9djl" (UID: "ab2a1f01-aab3-488d-8a5c-09e7a9568954") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:22.629777 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:22.629734 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret\") pod \"global-pull-secret-syncer-dghbg\" (UID: \"a3986444-d7dd-4409-b107-157fc81b5e02\") " pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:22.630243 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:22.629976 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:22.630243 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:22.630045 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret podName:a3986444-d7dd-4409-b107-157fc81b5e02 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:23.630025959 +0000 UTC m=+5.609537321 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret") pod "global-pull-secret-syncer-dghbg" (UID: "a3986444-d7dd-4409-b107-157fc81b5e02") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:22.632890 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:22.632866 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:22.633009 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:22.632979 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w9djl" podUID="ab2a1f01-aab3-488d-8a5c-09e7a9568954" Apr 22 14:15:22.633116 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:22.633079 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:22.633174 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:22.633159 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9rgrl" podUID="ff0fda3b-a631-4479-bca1-451b3fd7ac2f" Apr 22 14:15:22.747891 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:22.746122 2566 generic.go:358] "Generic (PLEG): container finished" podID="725459ae6e0afb73747d2931979035a7" containerID="c933c8f6e017580dd8d900ff26e6ec1a9eabc76333faa54d8b61ad8a86d2c096" exitCode=0 Apr 22 14:15:22.747891 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:22.746287 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal" event={"ID":"725459ae6e0afb73747d2931979035a7","Type":"ContainerDied","Data":"c933c8f6e017580dd8d900ff26e6ec1a9eabc76333faa54d8b61ad8a86d2c096"} Apr 22 14:15:23.629585 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:23.629550 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:23.629777 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:23.629696 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghbg" podUID="a3986444-d7dd-4409-b107-157fc81b5e02" Apr 22 14:15:23.638105 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:23.638075 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret\") pod \"global-pull-secret-syncer-dghbg\" (UID: \"a3986444-d7dd-4409-b107-157fc81b5e02\") " pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:23.638591 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:23.638267 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:23.638591 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:23.638323 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret podName:a3986444-d7dd-4409-b107-157fc81b5e02 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:25.638306729 +0000 UTC m=+7.617818085 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret") pod "global-pull-secret-syncer-dghbg" (UID: "a3986444-d7dd-4409-b107-157fc81b5e02") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:23.756269 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:23.756230 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal" event={"ID":"725459ae6e0afb73747d2931979035a7","Type":"ContainerStarted","Data":"d9506536eeeec63d2f23df4f452c07cf4650bd67e14da7b5565e67e068549f54"} Apr 22 14:15:23.770723 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:23.770663 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-161.ec2.internal" podStartSLOduration=4.770643805 podStartE2EDuration="4.770643805s" podCreationTimestamp="2026-04-22 14:15:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:23.769843182 +0000 UTC m=+5.749354563" watchObservedRunningTime="2026-04-22 14:15:23.770643805 +0000 UTC m=+5.750155185" Apr 22 14:15:24.142285 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:24.141679 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs\") pod \"network-metrics-daemon-9rgrl\" (UID: \"ff0fda3b-a631-4479-bca1-451b3fd7ac2f\") " pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:24.142285 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:24.141895 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:24.142285 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:24.141960 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs podName:ff0fda3b-a631-4479-bca1-451b3fd7ac2f nodeName:}" failed. No retries permitted until 2026-04-22 14:15:28.141941886 +0000 UTC m=+10.121453258 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs") pod "network-metrics-daemon-9rgrl" (UID: "ff0fda3b-a631-4479-bca1-451b3fd7ac2f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:24.242504 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:24.242466 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twxx2\" (UniqueName: \"kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2\") pod \"network-check-target-w9djl\" (UID: \"ab2a1f01-aab3-488d-8a5c-09e7a9568954\") " pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:24.242674 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:24.242660 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:24.242751 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:24.242684 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:24.242751 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:24.242697 2566 projected.go:194] Error preparing data for projected volume kube-api-access-twxx2 for pod openshift-network-diagnostics/network-check-target-w9djl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:24.242859 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:24.242770 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2 podName:ab2a1f01-aab3-488d-8a5c-09e7a9568954 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:28.242748409 +0000 UTC m=+10.222259789 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-twxx2" (UniqueName: "kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2") pod "network-check-target-w9djl" (UID: "ab2a1f01-aab3-488d-8a5c-09e7a9568954") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:24.629116 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:24.628900 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:24.629116 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:24.629037 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w9djl" podUID="ab2a1f01-aab3-488d-8a5c-09e7a9568954" Apr 22 14:15:24.629704 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:24.629490 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:24.629704 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:24.629617 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9rgrl" podUID="ff0fda3b-a631-4479-bca1-451b3fd7ac2f" Apr 22 14:15:25.629289 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:25.629230 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:25.629795 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:25.629363 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghbg" podUID="a3986444-d7dd-4409-b107-157fc81b5e02" Apr 22 14:15:25.655923 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:25.655891 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret\") pod \"global-pull-secret-syncer-dghbg\" (UID: \"a3986444-d7dd-4409-b107-157fc81b5e02\") " pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:25.656085 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:25.656065 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:25.656141 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:25.656134 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret podName:a3986444-d7dd-4409-b107-157fc81b5e02 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:29.656114154 +0000 UTC m=+11.635625525 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret") pod "global-pull-secret-syncer-dghbg" (UID: "a3986444-d7dd-4409-b107-157fc81b5e02") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:26.629000 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:26.628738 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:26.629000 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:26.628867 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w9djl" podUID="ab2a1f01-aab3-488d-8a5c-09e7a9568954" Apr 22 14:15:26.629000 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:26.628747 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:26.629000 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:26.628967 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9rgrl" podUID="ff0fda3b-a631-4479-bca1-451b3fd7ac2f" Apr 22 14:15:27.628916 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:27.628830 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:27.629328 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:27.628961 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghbg" podUID="a3986444-d7dd-4409-b107-157fc81b5e02" Apr 22 14:15:28.178373 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:28.178336 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs\") pod \"network-metrics-daemon-9rgrl\" (UID: \"ff0fda3b-a631-4479-bca1-451b3fd7ac2f\") " pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:28.178581 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:28.178508 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:28.178581 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:28.178578 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs podName:ff0fda3b-a631-4479-bca1-451b3fd7ac2f nodeName:}" failed. No retries permitted until 2026-04-22 14:15:36.178561954 +0000 UTC m=+18.158073324 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs") pod "network-metrics-daemon-9rgrl" (UID: "ff0fda3b-a631-4479-bca1-451b3fd7ac2f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:28.279687 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:28.279642 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twxx2\" (UniqueName: \"kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2\") pod \"network-check-target-w9djl\" (UID: \"ab2a1f01-aab3-488d-8a5c-09e7a9568954\") " pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:28.279891 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:28.279867 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:28.279891 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:28.279891 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:28.280070 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:28.279903 2566 projected.go:194] Error preparing data for projected volume kube-api-access-twxx2 for pod openshift-network-diagnostics/network-check-target-w9djl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:28.280070 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:28.279968 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2 podName:ab2a1f01-aab3-488d-8a5c-09e7a9568954 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:36.279949678 +0000 UTC m=+18.259461058 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-twxx2" (UniqueName: "kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2") pod "network-check-target-w9djl" (UID: "ab2a1f01-aab3-488d-8a5c-09e7a9568954") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:28.630766 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:28.630251 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:28.630766 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:28.630374 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w9djl" podUID="ab2a1f01-aab3-488d-8a5c-09e7a9568954" Apr 22 14:15:28.630766 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:28.630416 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:28.630766 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:28.630562 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9rgrl" podUID="ff0fda3b-a631-4479-bca1-451b3fd7ac2f" Apr 22 14:15:29.629100 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:29.629067 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:29.629261 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:29.629179 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghbg" podUID="a3986444-d7dd-4409-b107-157fc81b5e02" Apr 22 14:15:29.691230 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:29.691197 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret\") pod \"global-pull-secret-syncer-dghbg\" (UID: \"a3986444-d7dd-4409-b107-157fc81b5e02\") " pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:29.691629 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:29.691368 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:29.691629 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:29.691463 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret podName:a3986444-d7dd-4409-b107-157fc81b5e02 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:37.691444873 +0000 UTC m=+19.670956245 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret") pod "global-pull-secret-syncer-dghbg" (UID: "a3986444-d7dd-4409-b107-157fc81b5e02") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:30.629357 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:30.629322 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:30.629570 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:30.629322 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:30.629570 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:30.629468 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w9djl" podUID="ab2a1f01-aab3-488d-8a5c-09e7a9568954" Apr 22 14:15:30.629703 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:30.629578 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9rgrl" podUID="ff0fda3b-a631-4479-bca1-451b3fd7ac2f" Apr 22 14:15:31.628931 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:31.628894 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:31.629329 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:31.629014 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghbg" podUID="a3986444-d7dd-4409-b107-157fc81b5e02" Apr 22 14:15:32.629548 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:32.629513 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:32.630011 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:32.629627 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:32.630011 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:32.629645 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w9djl" podUID="ab2a1f01-aab3-488d-8a5c-09e7a9568954" Apr 22 14:15:32.630011 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:32.629723 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9rgrl" podUID="ff0fda3b-a631-4479-bca1-451b3fd7ac2f" Apr 22 14:15:33.629657 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:33.629626 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:33.630087 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:33.629738 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghbg" podUID="a3986444-d7dd-4409-b107-157fc81b5e02" Apr 22 14:15:34.629200 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:34.629162 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:34.629389 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:34.629213 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:34.629389 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:34.629333 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w9djl" podUID="ab2a1f01-aab3-488d-8a5c-09e7a9568954" Apr 22 14:15:34.629526 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:34.629495 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9rgrl" podUID="ff0fda3b-a631-4479-bca1-451b3fd7ac2f" Apr 22 14:15:35.629447 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:35.629397 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:35.629907 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:35.629539 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghbg" podUID="a3986444-d7dd-4409-b107-157fc81b5e02" Apr 22 14:15:36.237390 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:36.237352 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs\") pod \"network-metrics-daemon-9rgrl\" (UID: \"ff0fda3b-a631-4479-bca1-451b3fd7ac2f\") " pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:36.237590 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:36.237494 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:36.237590 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:36.237567 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs podName:ff0fda3b-a631-4479-bca1-451b3fd7ac2f nodeName:}" failed. No retries permitted until 2026-04-22 14:15:52.237547615 +0000 UTC m=+34.217058974 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs") pod "network-metrics-daemon-9rgrl" (UID: "ff0fda3b-a631-4479-bca1-451b3fd7ac2f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:36.338156 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:36.338117 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twxx2\" (UniqueName: \"kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2\") pod \"network-check-target-w9djl\" (UID: \"ab2a1f01-aab3-488d-8a5c-09e7a9568954\") " pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:36.338326 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:36.338295 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:36.338326 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:36.338316 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:36.338326 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:36.338326 2566 projected.go:194] Error preparing data for projected volume kube-api-access-twxx2 for pod openshift-network-diagnostics/network-check-target-w9djl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:36.338495 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:36.338385 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2 podName:ab2a1f01-aab3-488d-8a5c-09e7a9568954 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:52.338368208 +0000 UTC m=+34.317879575 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-twxx2" (UniqueName: "kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2") pod "network-check-target-w9djl" (UID: "ab2a1f01-aab3-488d-8a5c-09e7a9568954") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:36.628908 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:36.628874 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:36.629089 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:36.629011 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9rgrl" podUID="ff0fda3b-a631-4479-bca1-451b3fd7ac2f" Apr 22 14:15:36.629089 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:36.629059 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:36.629203 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:36.629144 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w9djl" podUID="ab2a1f01-aab3-488d-8a5c-09e7a9568954" Apr 22 14:15:37.628742 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:37.628703 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:37.629116 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:37.628811 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghbg" podUID="a3986444-d7dd-4409-b107-157fc81b5e02" Apr 22 14:15:37.747513 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:37.747473 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret\") pod \"global-pull-secret-syncer-dghbg\" (UID: \"a3986444-d7dd-4409-b107-157fc81b5e02\") " pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:37.747703 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:37.747681 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:37.747785 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:37.747774 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret podName:a3986444-d7dd-4409-b107-157fc81b5e02 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:53.747756076 +0000 UTC m=+35.727267436 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret") pod "global-pull-secret-syncer-dghbg" (UID: "a3986444-d7dd-4409-b107-157fc81b5e02") : object "kube-system"/"original-pull-secret" not registered Apr 22 14:15:38.631458 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:38.631143 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:38.631885 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:38.631549 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w9djl" podUID="ab2a1f01-aab3-488d-8a5c-09e7a9568954" Apr 22 14:15:38.632093 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:38.632064 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:38.632247 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:38.632226 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9rgrl" podUID="ff0fda3b-a631-4479-bca1-451b3fd7ac2f" Apr 22 14:15:38.782935 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:38.782903 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rnwmp" event={"ID":"6a11fb65-b996-42de-a115-49420effa19b","Type":"ContainerStarted","Data":"e968e6d9f780526d0f3ad905b15341990949881213ae8ef00ad2ff3f70fd7637"} Apr 22 14:15:38.784415 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:38.784388 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nwk64" event={"ID":"1e1454ff-e291-42e9-8bb6-cd922139fd02","Type":"ContainerStarted","Data":"37ae5037035544e296e01948268e31b1318e471336a33e0fe9f8d3f785cf9ea1"} Apr 22 14:15:38.785893 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:38.785868 2566 generic.go:358] "Generic (PLEG): container finished" podID="57899f5d-95c9-4f88-8a37-538507647859" containerID="f62b2913289f878941d0188dd0c5f70d711709e9b071f57387c40d94b2c55b56" exitCode=0 Apr 22 14:15:38.785994 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:38.785926 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ng46h" event={"ID":"57899f5d-95c9-4f88-8a37-538507647859","Type":"ContainerDied","Data":"f62b2913289f878941d0188dd0c5f70d711709e9b071f57387c40d94b2c55b56"} Apr 22 14:15:38.787286 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:38.787260 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn" event={"ID":"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855","Type":"ContainerStarted","Data":"8be399e1c875cf0793248b3d7253a86fcbcb62b8e4b854ac251e153d80ed0630"} Apr 22 14:15:38.789893 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:38.789876 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/ovn-acl-logging/0.log" Apr 22 14:15:38.790239 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:38.790215 2566 generic.go:358] "Generic (PLEG): container finished" podID="d37f6164-ab7b-4939-a74e-19ab726827bb" containerID="1c6edb34a6ae5ecdbddd04a1344fff6ec6debac85fa8d68ce333310b568d08d1" exitCode=1 Apr 22 14:15:38.790334 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:38.790244 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-47psb" event={"ID":"d37f6164-ab7b-4939-a74e-19ab726827bb","Type":"ContainerStarted","Data":"b6bdd3b0f5b9a1ff03f6e8b33f7c9b41efbd17a3672f198ce122b691f0602d79"} Apr 22 14:15:38.790334 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:38.790276 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-47psb" event={"ID":"d37f6164-ab7b-4939-a74e-19ab726827bb","Type":"ContainerStarted","Data":"f851ec6d4ecc46f1d36c4febc3e6d8e8f307132b2cf5f49cbb52d9abcbdf5eb7"} Apr 22 14:15:38.790334 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:38.790294 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-47psb" event={"ID":"d37f6164-ab7b-4939-a74e-19ab726827bb","Type":"ContainerDied","Data":"1c6edb34a6ae5ecdbddd04a1344fff6ec6debac85fa8d68ce333310b568d08d1"} Apr 22 14:15:38.790334 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:38.790307 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-47psb" event={"ID":"d37f6164-ab7b-4939-a74e-19ab726827bb","Type":"ContainerStarted","Data":"07f9a9b4a15fb6b5426cd59787e32a2f14b2af3e23ac602b394c3e31a8206df7"} Apr 22 14:15:38.791492 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:38.791473 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6rsgf" event={"ID":"033bc69f-f51f-40ca-8484-1ae2dc580b53","Type":"ContainerStarted","Data":"f1180df6cc80a2974e6d2ad153aae0177e338e32eac01846c0a07adc3b6630e0"} Apr 22 14:15:38.792726 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:38.792695 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zmgl9" event={"ID":"1c429a6e-0682-4fe6-9ec0-b39e350ccc63","Type":"ContainerStarted","Data":"6e1d41b83e94a44e8d1aa8a6a58a5fb1c88cb0e2cd25fe2b72b226c65e0ddf7b"} Apr 22 14:15:38.793905 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:38.793884 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" event={"ID":"c9e3f13d-f48a-4ce7-a59d-16c11e660545","Type":"ContainerStarted","Data":"f718782cd990553f9e5cd71c78a168a2f2a5a8052301545070df49dc6dd869ea"} Apr 22 14:15:38.797078 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:38.797038 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-rnwmp" podStartSLOduration=3.882371943 podStartE2EDuration="20.797023069s" podCreationTimestamp="2026-04-22 14:15:18 +0000 UTC" firstStartedPulling="2026-04-22 14:15:21.213628233 +0000 UTC m=+3.193139595" lastFinishedPulling="2026-04-22 14:15:38.128279355 +0000 UTC m=+20.107790721" observedRunningTime="2026-04-22 14:15:38.79635564 +0000 UTC m=+20.775867019" watchObservedRunningTime="2026-04-22 14:15:38.797023069 +0000 UTC m=+20.776534450" Apr 22 14:15:38.808666 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:38.808620 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6rsgf" podStartSLOduration=3.8903493510000002 podStartE2EDuration="20.808604287s" podCreationTimestamp="2026-04-22 14:15:18 +0000 UTC" firstStartedPulling="2026-04-22 14:15:21.20998004 +0000 UTC m=+3.189491396" lastFinishedPulling="2026-04-22 14:15:38.128234962 +0000 UTC m=+20.107746332" observedRunningTime="2026-04-22 14:15:38.808581522 +0000 UTC m=+20.788092902" watchObservedRunningTime="2026-04-22 14:15:38.808604287 +0000 UTC m=+20.788115664" Apr 22 14:15:38.822171 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:38.822127 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-4l7kw" podStartSLOduration=3.861519236 podStartE2EDuration="20.822114408s" podCreationTimestamp="2026-04-22 14:15:18 +0000 UTC" firstStartedPulling="2026-04-22 14:15:21.191405279 +0000 UTC m=+3.170916636" lastFinishedPulling="2026-04-22 14:15:38.152000436 +0000 UTC m=+20.131511808" observedRunningTime="2026-04-22 14:15:38.821670201 +0000 UTC m=+20.801181584" watchObservedRunningTime="2026-04-22 14:15:38.822114408 +0000 UTC m=+20.801625786" Apr 22 14:15:38.833289 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:38.833253 2566 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zmgl9" podStartSLOduration=2.891055273 podStartE2EDuration="19.833241926s" podCreationTimestamp="2026-04-22 14:15:19 +0000 UTC" firstStartedPulling="2026-04-22 14:15:21.209784884 +0000 UTC m=+3.189296242" lastFinishedPulling="2026-04-22 14:15:38.151971523 +0000 UTC m=+20.131482895" observedRunningTime="2026-04-22 14:15:38.832849925 +0000 UTC m=+20.812361316" watchObservedRunningTime="2026-04-22 14:15:38.833241926 +0000 UTC m=+20.812753305" Apr 22 14:15:39.629728 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:39.629698 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:39.629955 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:39.629844 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghbg" podUID="a3986444-d7dd-4409-b107-157fc81b5e02" Apr 22 14:15:39.798851 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:39.798830 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/ovn-acl-logging/0.log" Apr 22 14:15:39.799236 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:39.799210 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-47psb" event={"ID":"d37f6164-ab7b-4939-a74e-19ab726827bb","Type":"ContainerStarted","Data":"36aae735c3f214719a0c880556161134445d1151af28727a9d8d99a856d518c2"} Apr 22 14:15:39.799297 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:39.799246 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-47psb" event={"ID":"d37f6164-ab7b-4939-a74e-19ab726827bb","Type":"ContainerStarted","Data":"60887c64a2f71df28b33ffd103345828f96ca20f9071990f990009050404c949"} Apr 22 14:15:39.800617 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:39.800587 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-l8hc9" event={"ID":"bccc745a-d0a3-4d47-bb03-7502b82f4a26","Type":"ContainerStarted","Data":"ec3780e1a03da300fb767427d0bf033d8556200e757e5913051952a0b61446c0"} Apr 22 14:15:39.846662 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:39.846602 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nwk64" podStartSLOduration=4.871462955 podStartE2EDuration="21.846583698s" podCreationTimestamp="2026-04-22 14:15:18 +0000 UTC" firstStartedPulling="2026-04-22 14:15:21.212928281 +0000 UTC m=+3.192439649" lastFinishedPulling="2026-04-22 14:15:38.188049036 +0000 UTC m=+20.167560392" observedRunningTime="2026-04-22 14:15:38.880648801 +0000 UTC m=+20.860160180" watchObservedRunningTime="2026-04-22 14:15:39.846583698 +0000 UTC m=+21.826095079" Apr 22 14:15:39.846973 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:39.846937 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-l8hc9" podStartSLOduration=4.8887295779999995 podStartE2EDuration="21.84692729s" podCreationTimestamp="2026-04-22 14:15:18 +0000 UTC" firstStartedPulling="2026-04-22 14:15:21.193773106 +0000 UTC m=+3.173284466" 
lastFinishedPulling="2026-04-22 14:15:38.151970813 +0000 UTC m=+20.131482178" observedRunningTime="2026-04-22 14:15:39.846285412 +0000 UTC m=+21.825796795" watchObservedRunningTime="2026-04-22 14:15:39.84692729 +0000 UTC m=+21.826438670" Apr 22 14:15:39.960519 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:39.960397 2566 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 14:15:40.570950 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:40.570854 2566 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T14:15:39.960421519Z","UUID":"ece34cd7-29e0-4f67-bc07-73bdf10f4d35","Handler":null,"Name":"","Endpoint":""} Apr 22 14:15:40.572647 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:40.572621 2566 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 14:15:40.572647 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:40.572651 2566 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 14:15:40.628911 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:40.628876 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:40.629088 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:40.628876 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:40.629088 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:40.629006 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w9djl" podUID="ab2a1f01-aab3-488d-8a5c-09e7a9568954" Apr 22 14:15:40.629088 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:40.629061 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9rgrl" podUID="ff0fda3b-a631-4479-bca1-451b3fd7ac2f" Apr 22 14:15:40.805583 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:40.805543 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn" event={"ID":"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855","Type":"ContainerStarted","Data":"e31c6021e2c1aed25132b9fc22a675fcb1dfbd9a1a4df6dc2e100ec14fd8c784"} Apr 22 14:15:41.629796 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:41.629614 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:41.629956 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:41.629883 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghbg" podUID="a3986444-d7dd-4409-b107-157fc81b5e02" Apr 22 14:15:41.795563 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:41.795518 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-rnwmp" Apr 22 14:15:41.796240 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:41.796219 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-rnwmp" Apr 22 14:15:41.807511 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:41.807480 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-rnwmp" Apr 22 14:15:41.808061 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:41.808039 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-rnwmp" Apr 22 14:15:42.629193 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:42.629163 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:42.629397 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:42.629292 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w9djl" podUID="ab2a1f01-aab3-488d-8a5c-09e7a9568954" Apr 22 14:15:42.629397 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:42.629341 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:42.629534 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:42.629479 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9rgrl" podUID="ff0fda3b-a631-4479-bca1-451b3fd7ac2f" Apr 22 14:15:43.629360 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:43.629322 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:43.629954 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:43.629444 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dghbg" podUID="a3986444-d7dd-4409-b107-157fc81b5e02" Apr 22 14:15:43.814964 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:43.814868 2566 generic.go:358] "Generic (PLEG): container finished" podID="57899f5d-95c9-4f88-8a37-538507647859" containerID="08cdb26ab782c5a069cf65af7cf3085e2e137919ce28cd7be1c2a3cbf03b7c4f" exitCode=0 Apr 22 14:15:43.815115 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:43.814959 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ng46h" event={"ID":"57899f5d-95c9-4f88-8a37-538507647859","Type":"ContainerDied","Data":"08cdb26ab782c5a069cf65af7cf3085e2e137919ce28cd7be1c2a3cbf03b7c4f"} Apr 22 14:15:43.817466 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:43.817426 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn" event={"ID":"549a6cb6-40be-4bb2-9a43-ba2a9d5ea855","Type":"ContainerStarted","Data":"e2ff3db37603ce9b37ca8f3c4897c826064e0ffe2024ef21ff837d1ea9b2f94e"} Apr 22 14:15:43.820170 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:43.820152 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/ovn-acl-logging/0.log" Apr 22 14:15:43.820617 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:43.820587 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-47psb" event={"ID":"d37f6164-ab7b-4939-a74e-19ab726827bb","Type":"ContainerStarted","Data":"f6aa3ce57c6ca66490d29ae79aa315618fa4acf05b0a82f064bb47cf8aa9576c"} Apr 22 14:15:43.852895 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:43.852851 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jm7xn" podStartSLOduration=5.859254107 podStartE2EDuration="25.852838097s" podCreationTimestamp="2026-04-22 14:15:18 +0000 UTC" firstStartedPulling="2026-04-22 14:15:21.198578075 +0000 UTC m=+3.178089440" lastFinishedPulling="2026-04-22 14:15:41.19216206 +0000 UTC m=+23.171673430" observedRunningTime="2026-04-22 14:15:43.852589839 +0000 UTC m=+25.832101218" watchObservedRunningTime="2026-04-22 14:15:43.852838097 +0000 UTC m=+25.832349476" Apr 22 14:15:44.628676 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:44.628655 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:44.628768 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:44.628751 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w9djl" podUID="ab2a1f01-aab3-488d-8a5c-09e7a9568954" Apr 22 14:15:44.628807 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:44.628790 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:44.628856 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:44.628842 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9rgrl" podUID="ff0fda3b-a631-4479-bca1-451b3fd7ac2f" Apr 22 14:15:44.823943 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:44.823862 2566 generic.go:358] "Generic (PLEG): container finished" podID="57899f5d-95c9-4f88-8a37-538507647859" containerID="8fb5ff71cd02035ad4ec808661725e8f2898202bc4e6384f58861ec2f112770f" exitCode=0 Apr 22 14:15:44.824358 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:44.823956 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ng46h" event={"ID":"57899f5d-95c9-4f88-8a37-538507647859","Type":"ContainerDied","Data":"8fb5ff71cd02035ad4ec808661725e8f2898202bc4e6384f58861ec2f112770f"} Apr 22 14:15:45.629400 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:45.629311 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:45.629585 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:45.629448 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghbg" podUID="a3986444-d7dd-4409-b107-157fc81b5e02" Apr 22 14:15:45.828143 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:45.828102 2566 generic.go:358] "Generic (PLEG): container finished" podID="57899f5d-95c9-4f88-8a37-538507647859" containerID="08d31f7469bd1e92624d65bd0fce952483a3497353f283df125a33695153ee28" exitCode=0 Apr 22 14:15:45.828709 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:45.828189 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ng46h" event={"ID":"57899f5d-95c9-4f88-8a37-538507647859","Type":"ContainerDied","Data":"08d31f7469bd1e92624d65bd0fce952483a3497353f283df125a33695153ee28"} Apr 22 14:15:45.831181 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:45.831163 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/ovn-acl-logging/0.log" Apr 22 14:15:45.831490 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:45.831459 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-47psb" event={"ID":"d37f6164-ab7b-4939-a74e-19ab726827bb","Type":"ContainerStarted","Data":"0def4e687c6d89d6c42bd129373355a76d5b8c48b0f6d3af4fc14951107d7f9d"} Apr 22 14:15:45.831722 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:45.831705 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:45.831825 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:45.831728 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:45.831968 ip-10-0-129-161 kubenswrapper[2566]: 
I0422 14:15:45.831949 2566 scope.go:117] "RemoveContainer" containerID="1c6edb34a6ae5ecdbddd04a1344fff6ec6debac85fa8d68ce333310b568d08d1" Apr 22 14:15:45.846388 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:45.846365 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:46.629523 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:46.628852 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:46.629523 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:46.629004 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w9djl" podUID="ab2a1f01-aab3-488d-8a5c-09e7a9568954" Apr 22 14:15:46.629523 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:46.629352 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:46.629523 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:46.629472 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9rgrl" podUID="ff0fda3b-a631-4479-bca1-451b3fd7ac2f" Apr 22 14:15:46.836653 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:46.836626 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/ovn-acl-logging/0.log" Apr 22 14:15:46.837085 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:46.836972 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-47psb" event={"ID":"d37f6164-ab7b-4939-a74e-19ab726827bb","Type":"ContainerStarted","Data":"76937508ce865e6e6981781d01ea7c19e4d0bd542343bc45c0e855060d7d752e"} Apr 22 14:15:46.837272 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:46.837250 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:46.854706 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:46.854680 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:15:46.879789 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:46.879681 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-47psb" podStartSLOduration=11.853028999 podStartE2EDuration="28.879668085s" podCreationTimestamp="2026-04-22 14:15:18 +0000 UTC" firstStartedPulling="2026-04-22 14:15:21.193573716 +0000 UTC m=+3.173085077" lastFinishedPulling="2026-04-22 14:15:38.220212802 +0000 UTC m=+20.199724163" observedRunningTime="2026-04-22 14:15:46.879599757 +0000 UTC m=+28.859111136" watchObservedRunningTime="2026-04-22 14:15:46.879668085 +0000 UTC m=+28.859179464" Apr 22 14:15:46.904334 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:46.904301 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kube-system/global-pull-secret-syncer-dghbg"] Apr 22 14:15:46.904530 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:46.904477 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:46.904716 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:46.904610 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghbg" podUID="a3986444-d7dd-4409-b107-157fc81b5e02" Apr 22 14:15:46.909270 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:46.909243 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9rgrl"] Apr 22 14:15:46.909401 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:46.909386 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:46.909525 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:46.909507 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9rgrl" podUID="ff0fda3b-a631-4479-bca1-451b3fd7ac2f" Apr 22 14:15:46.916713 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:46.916229 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-w9djl"] Apr 22 14:15:46.916713 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:46.916358 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:46.916713 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:46.916509 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w9djl" podUID="ab2a1f01-aab3-488d-8a5c-09e7a9568954" Apr 22 14:15:48.630535 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:48.630278 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:48.631019 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:48.630346 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:48.631019 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:48.630634 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dghbg" podUID="a3986444-d7dd-4409-b107-157fc81b5e02" Apr 22 14:15:48.631019 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:48.630716 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w9djl" podUID="ab2a1f01-aab3-488d-8a5c-09e7a9568954" Apr 22 14:15:48.631019 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:48.630367 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:48.631019 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:48.630823 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9rgrl" podUID="ff0fda3b-a631-4479-bca1-451b3fd7ac2f" Apr 22 14:15:50.390746 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.390717 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-161.ec2.internal" event="NodeReady" Apr 22 14:15:50.391322 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.390876 2566 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 14:15:50.426805 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.426764 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-779f98969c-lk4m8"] Apr 22 14:15:50.455600 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.455569 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-779f98969c-lk4m8"] Apr 22 14:15:50.455600 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.455597 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-f4zzk"] Apr 22 14:15:50.455793 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.455748 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.459012 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.458961 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 14:15:50.459012 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.459007 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 14:15:50.459224 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.459030 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 14:15:50.459224 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.459158 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xbchn\"" Apr 22 14:15:50.466070 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.466048 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 14:15:50.469609 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.469580 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-22qt9"] Apr 22 14:15:50.469741 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.469717 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-f4zzk" Apr 22 14:15:50.472660 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.472633 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 14:15:50.472784 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.472702 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-slf7l\"" Apr 22 14:15:50.472784 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.472703 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 14:15:50.472981 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.472962 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 14:15:50.496370 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.496332 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-22qt9"] Apr 22 14:15:50.496370 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.496366 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-f4zzk"] Apr 22 14:15:50.496574 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.496524 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-22qt9" Apr 22 14:15:50.499621 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.499603 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fpnmw\"" Apr 22 14:15:50.499744 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.499684 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 14:15:50.499806 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.499759 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 14:15:50.545011 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.544977 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-ca-trust-extracted\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.545011 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.545015 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2xwv\" (UniqueName: \"kubernetes.io/projected/3da3d9ed-4865-4d4d-a429-13417afb99df-kube-api-access-x2xwv\") pod \"ingress-canary-f4zzk\" (UID: \"3da3d9ed-4865-4d4d-a429-13417afb99df\") " pod="openshift-ingress-canary/ingress-canary-f4zzk" Apr 22 14:15:50.545206 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.545137 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-installation-pull-secrets\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.545206 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.545197 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.545273 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.545223 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-trusted-ca\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.545273 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.545251 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cld6j\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-kube-api-access-cld6j\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.545370 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.545315 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-image-registry-private-configuration\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.545370 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.545353 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert\") pod \"ingress-canary-f4zzk\" (UID: \"3da3d9ed-4865-4d4d-a429-13417afb99df\") " pod="openshift-ingress-canary/ingress-canary-f4zzk" Apr 22 14:15:50.545469 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.545393 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-certificates\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.545469 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.545418 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-bound-sa-token\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.628806 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.628769 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:50.628985 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.628808 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:50.628985 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.628866 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:50.631717 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.631685 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 14:15:50.631717 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.631686 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-r76nx\"" Apr 22 14:15:50.631902 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.631810 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 14:15:50.631902 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.631851 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 14:15:50.632139 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.632083 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-77xtv\"" Apr 22 14:15:50.632270 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.632145 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 14:15:50.632976 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.632898 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-644884c4c7-cnxnn"] Apr 22 14:15:50.645914 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.645892 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-installation-pull-secrets\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.646011 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.645928 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.646011 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.645956 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-trusted-ca\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.646011 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.645982 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cld6j\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-kube-api-access-cld6j\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.646168 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.646011 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" 
(UniqueName: \"kubernetes.io/secret/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-image-registry-private-configuration\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.646168 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.646042 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d568b082-5592-4166-8e56-4b3f5d03022f-config-volume\") pod \"dns-default-22qt9\" (UID: \"d568b082-5592-4166-8e56-4b3f5d03022f\") " pod="openshift-dns/dns-default-22qt9" Apr 22 14:15:50.646168 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:50.646059 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:15:50.646168 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:50.646076 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779f98969c-lk4m8: secret "image-registry-tls" not found Apr 22 14:15:50.646168 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.646075 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert\") pod \"ingress-canary-f4zzk\" (UID: \"3da3d9ed-4865-4d4d-a429-13417afb99df\") " pod="openshift-ingress-canary/ingress-canary-f4zzk" Apr 22 14:15:50.646168 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.646100 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls\") pod \"dns-default-22qt9\" (UID: \"d568b082-5592-4166-8e56-4b3f5d03022f\") " pod="openshift-dns/dns-default-22qt9" Apr 22 14:15:50.646168 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.646124 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d568b082-5592-4166-8e56-4b3f5d03022f-tmp-dir\") pod \"dns-default-22qt9\" (UID: \"d568b082-5592-4166-8e56-4b3f5d03022f\") " pod="openshift-dns/dns-default-22qt9" Apr 22 14:15:50.646168 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:50.646146 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls podName:0160c8de-74a1-4adf-854c-2efcb3f7ab8e nodeName:}" failed. No retries permitted until 2026-04-22 14:15:51.146123027 +0000 UTC m=+33.125634387 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls") pod "image-registry-779f98969c-lk4m8" (UID: "0160c8de-74a1-4adf-854c-2efcb3f7ab8e") : secret "image-registry-tls" not found Apr 22 14:15:50.646544 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:50.646362 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:50.646544 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:50.646451 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert podName:3da3d9ed-4865-4d4d-a429-13417afb99df nodeName:}" failed. 
No retries permitted until 2026-04-22 14:15:51.146413005 +0000 UTC m=+33.125924366 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert") pod "ingress-canary-f4zzk" (UID: "3da3d9ed-4865-4d4d-a429-13417afb99df") : secret "canary-serving-cert" not found Apr 22 14:15:50.646544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.646513 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvkv2\" (UniqueName: \"kubernetes.io/projected/d568b082-5592-4166-8e56-4b3f5d03022f-kube-api-access-gvkv2\") pod \"dns-default-22qt9\" (UID: \"d568b082-5592-4166-8e56-4b3f5d03022f\") " pod="openshift-dns/dns-default-22qt9" Apr 22 14:15:50.646681 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.646550 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-certificates\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.646681 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.646576 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-bound-sa-token\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.646681 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.646603 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-ca-trust-extracted\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.646681 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.646628 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2xwv\" (UniqueName: \"kubernetes.io/projected/3da3d9ed-4865-4d4d-a429-13417afb99df-kube-api-access-x2xwv\") pod \"ingress-canary-f4zzk\" (UID: \"3da3d9ed-4865-4d4d-a429-13417afb99df\") " pod="openshift-ingress-canary/ingress-canary-f4zzk" Apr 22 14:15:50.646970 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.646945 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-trusted-ca\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.647047 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.647033 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-ca-trust-extracted\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.647367 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.647320 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-certificates\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.651202 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.651175 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-image-registry-private-configuration\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.655736 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.655713 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-644884c4c7-cnxnn"] Apr 22 14:15:50.655925 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.655887 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-644884c4c7-cnxnn" Apr 22 14:15:50.657592 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.657570 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cld6j\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-kube-api-access-cld6j\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.658377 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.658355 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2xwv\" (UniqueName: \"kubernetes.io/projected/3da3d9ed-4865-4d4d-a429-13417afb99df-kube-api-access-x2xwv\") pod \"ingress-canary-f4zzk\" (UID: \"3da3d9ed-4865-4d4d-a429-13417afb99df\") " pod="openshift-ingress-canary/ingress-canary-f4zzk" Apr 22 14:15:50.658512 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.658488 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-bound-sa-token\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.658718 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.658687 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 14:15:50.659058 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.659039 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-zp9pr\"" Apr 22 14:15:50.659130 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.659039 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 14:15:50.659130 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.659039 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 14:15:50.659257 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.659040 2566 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 14:15:50.661697 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.661679 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-installation-pull-secrets\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:50.684694 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.684666 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb"] Apr 22 14:15:50.713728 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.713668 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb"] Apr 22 14:15:50.713728 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.713702 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs"] Apr 22 14:15:50.713946 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.713820 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" Apr 22 14:15:50.716667 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.716643 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 14:15:50.716667 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.716669 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 14:15:50.716837 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.716712 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 14:15:50.716837 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.716651 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 14:15:50.728171 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.728147 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs"] Apr 22 14:15:50.728283 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.728267 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs" Apr 22 14:15:50.730801 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.730781 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 14:15:50.747878 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.747846 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d568b082-5592-4166-8e56-4b3f5d03022f-config-volume\") pod \"dns-default-22qt9\" (UID: \"d568b082-5592-4166-8e56-4b3f5d03022f\") " pod="openshift-dns/dns-default-22qt9" Apr 22 14:15:50.748021 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.747895 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/10eb118e-4820-4012-9cec-b6bf9fd92cb6-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-644884c4c7-cnxnn\" (UID: \"10eb118e-4820-4012-9cec-b6bf9fd92cb6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-644884c4c7-cnxnn" Apr 22 14:15:50.748021 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.747933 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls\") pod \"dns-default-22qt9\" (UID: \"d568b082-5592-4166-8e56-4b3f5d03022f\") " pod="openshift-dns/dns-default-22qt9" Apr 22 14:15:50.748021 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.747958 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d568b082-5592-4166-8e56-4b3f5d03022f-tmp-dir\") pod \"dns-default-22qt9\" (UID: \"d568b082-5592-4166-8e56-4b3f5d03022f\") " pod="openshift-dns/dns-default-22qt9" Apr 22 14:15:50.748169 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.748040 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvkv2\" (UniqueName: \"kubernetes.io/projected/d568b082-5592-4166-8e56-4b3f5d03022f-kube-api-access-gvkv2\") pod \"dns-default-22qt9\" (UID: \"d568b082-5592-4166-8e56-4b3f5d03022f\") " pod="openshift-dns/dns-default-22qt9" Apr 22 14:15:50.748169 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.748086 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44h4w\" (UniqueName: \"kubernetes.io/projected/10eb118e-4820-4012-9cec-b6bf9fd92cb6-kube-api-access-44h4w\") pod \"managed-serviceaccount-addon-agent-644884c4c7-cnxnn\" (UID: \"10eb118e-4820-4012-9cec-b6bf9fd92cb6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-644884c4c7-cnxnn" Apr 22 14:15:50.748169 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:50.748144 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:50.748303 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:50.748199 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls podName:d568b082-5592-4166-8e56-4b3f5d03022f nodeName:}" failed. No retries permitted until 2026-04-22 14:15:51.248179677 +0000 UTC m=+33.227691049 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls") pod "dns-default-22qt9" (UID: "d568b082-5592-4166-8e56-4b3f5d03022f") : secret "dns-default-metrics-tls" not found Apr 22 14:15:50.748399 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.748382 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d568b082-5592-4166-8e56-4b3f5d03022f-tmp-dir\") pod \"dns-default-22qt9\" (UID: \"d568b082-5592-4166-8e56-4b3f5d03022f\") " pod="openshift-dns/dns-default-22qt9" Apr 22 14:15:50.748697 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.748539 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d568b082-5592-4166-8e56-4b3f5d03022f-config-volume\") pod \"dns-default-22qt9\" (UID: \"d568b082-5592-4166-8e56-4b3f5d03022f\") " pod="openshift-dns/dns-default-22qt9" Apr 22 14:15:50.761293 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.761272 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvkv2\" (UniqueName: \"kubernetes.io/projected/d568b082-5592-4166-8e56-4b3f5d03022f-kube-api-access-gvkv2\") pod \"dns-default-22qt9\" (UID: \"d568b082-5592-4166-8e56-4b3f5d03022f\") " pod="openshift-dns/dns-default-22qt9" Apr 22 14:15:50.849304 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.849261 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/be5941d8-31ae-4b55-ab1f-4fe7c679b8c8-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5874c5788b-4jfgb\" (UID: \"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" Apr 22 14:15:50.849462 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.849334 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6731ebe9-2efe-4307-a635-926bb91326dd-tmp\") pod \"klusterlet-addon-workmgr-7849bfffb6-66jvs\" (UID: \"6731ebe9-2efe-4307-a635-926bb91326dd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs" Apr 22 14:15:50.849462 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.849382 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rtrk\" (UniqueName: \"kubernetes.io/projected/6731ebe9-2efe-4307-a635-926bb91326dd-kube-api-access-9rtrk\") pod \"klusterlet-addon-workmgr-7849bfffb6-66jvs\" (UID: \"6731ebe9-2efe-4307-a635-926bb91326dd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs" Apr 22 14:15:50.849462 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.849419 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/10eb118e-4820-4012-9cec-b6bf9fd92cb6-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-644884c4c7-cnxnn\" (UID: \"10eb118e-4820-4012-9cec-b6bf9fd92cb6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-644884c4c7-cnxnn" Apr 22 14:15:50.849462 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.849456 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: 
\"kubernetes.io/secret/be5941d8-31ae-4b55-ab1f-4fe7c679b8c8-ca\") pod \"cluster-proxy-proxy-agent-5874c5788b-4jfgb\" (UID: \"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" Apr 22 14:15:50.849652 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.849477 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/be5941d8-31ae-4b55-ab1f-4fe7c679b8c8-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5874c5788b-4jfgb\" (UID: \"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" Apr 22 14:15:50.849652 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.849537 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/be5941d8-31ae-4b55-ab1f-4fe7c679b8c8-hub\") pod \"cluster-proxy-proxy-agent-5874c5788b-4jfgb\" (UID: \"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" Apr 22 14:15:50.849652 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.849571 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44h4w\" (UniqueName: \"kubernetes.io/projected/10eb118e-4820-4012-9cec-b6bf9fd92cb6-kube-api-access-44h4w\") pod \"managed-serviceaccount-addon-agent-644884c4c7-cnxnn\" (UID: \"10eb118e-4820-4012-9cec-b6bf9fd92cb6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-644884c4c7-cnxnn" Apr 22 14:15:50.849652 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.849589 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6731ebe9-2efe-4307-a635-926bb91326dd-klusterlet-config\") pod \"klusterlet-addon-workmgr-7849bfffb6-66jvs\" (UID: \"6731ebe9-2efe-4307-a635-926bb91326dd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs" Apr 22 14:15:50.849652 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.849612 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/be5941d8-31ae-4b55-ab1f-4fe7c679b8c8-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5874c5788b-4jfgb\" (UID: \"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" Apr 22 14:15:50.849652 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.849651 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccwdw\" (UniqueName: \"kubernetes.io/projected/be5941d8-31ae-4b55-ab1f-4fe7c679b8c8-kube-api-access-ccwdw\") pod \"cluster-proxy-proxy-agent-5874c5788b-4jfgb\" (UID: \"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" Apr 22 14:15:50.852067 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.852043 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/10eb118e-4820-4012-9cec-b6bf9fd92cb6-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-644884c4c7-cnxnn\" (UID: \"10eb118e-4820-4012-9cec-b6bf9fd92cb6\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-644884c4c7-cnxnn" Apr 22 14:15:50.858757 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.858726 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44h4w\" (UniqueName: \"kubernetes.io/projected/10eb118e-4820-4012-9cec-b6bf9fd92cb6-kube-api-access-44h4w\") pod \"managed-serviceaccount-addon-agent-644884c4c7-cnxnn\" (UID: \"10eb118e-4820-4012-9cec-b6bf9fd92cb6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-644884c4c7-cnxnn" Apr 22 14:15:50.950564 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.950476 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/be5941d8-31ae-4b55-ab1f-4fe7c679b8c8-hub\") pod \"cluster-proxy-proxy-agent-5874c5788b-4jfgb\" (UID: \"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" Apr 22 14:15:50.950564 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.950516 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6731ebe9-2efe-4307-a635-926bb91326dd-klusterlet-config\") pod \"klusterlet-addon-workmgr-7849bfffb6-66jvs\" (UID: \"6731ebe9-2efe-4307-a635-926bb91326dd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs" Apr 22 14:15:50.950564 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.950547 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/be5941d8-31ae-4b55-ab1f-4fe7c679b8c8-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5874c5788b-4jfgb\" (UID: \"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" Apr 22 14:15:50.950811 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.950760 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ccwdw\" (UniqueName: \"kubernetes.io/projected/be5941d8-31ae-4b55-ab1f-4fe7c679b8c8-kube-api-access-ccwdw\") pod \"cluster-proxy-proxy-agent-5874c5788b-4jfgb\" (UID: \"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" Apr 22 14:15:50.950874 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.950850 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/be5941d8-31ae-4b55-ab1f-4fe7c679b8c8-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5874c5788b-4jfgb\" (UID: \"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" Apr 22 14:15:50.950927 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.950894 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6731ebe9-2efe-4307-a635-926bb91326dd-tmp\") pod \"klusterlet-addon-workmgr-7849bfffb6-66jvs\" (UID: \"6731ebe9-2efe-4307-a635-926bb91326dd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs" Apr 22 14:15:50.950980 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.950950 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rtrk\" (UniqueName: 
\"kubernetes.io/projected/6731ebe9-2efe-4307-a635-926bb91326dd-kube-api-access-9rtrk\") pod \"klusterlet-addon-workmgr-7849bfffb6-66jvs\" (UID: \"6731ebe9-2efe-4307-a635-926bb91326dd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs" Apr 22 14:15:50.951015 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.951000 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/be5941d8-31ae-4b55-ab1f-4fe7c679b8c8-ca\") pod \"cluster-proxy-proxy-agent-5874c5788b-4jfgb\" (UID: \"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" Apr 22 14:15:50.951063 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.951026 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/be5941d8-31ae-4b55-ab1f-4fe7c679b8c8-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5874c5788b-4jfgb\" (UID: \"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" Apr 22 14:15:50.951473 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.951429 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6731ebe9-2efe-4307-a635-926bb91326dd-tmp\") pod \"klusterlet-addon-workmgr-7849bfffb6-66jvs\" (UID: \"6731ebe9-2efe-4307-a635-926bb91326dd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs" Apr 22 14:15:50.951819 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.951774 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/be5941d8-31ae-4b55-ab1f-4fe7c679b8c8-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5874c5788b-4jfgb\" (UID: \"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" Apr 22 14:15:50.953663 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.953636 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/be5941d8-31ae-4b55-ab1f-4fe7c679b8c8-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5874c5788b-4jfgb\" (UID: \"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" Apr 22 14:15:50.953757 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.953682 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6731ebe9-2efe-4307-a635-926bb91326dd-klusterlet-config\") pod \"klusterlet-addon-workmgr-7849bfffb6-66jvs\" (UID: \"6731ebe9-2efe-4307-a635-926bb91326dd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs" Apr 22 14:15:50.953757 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.953726 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/be5941d8-31ae-4b55-ab1f-4fe7c679b8c8-hub\") pod \"cluster-proxy-proxy-agent-5874c5788b-4jfgb\" (UID: \"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" Apr 22 14:15:50.953869 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.953808 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"ca\" (UniqueName: \"kubernetes.io/secret/be5941d8-31ae-4b55-ab1f-4fe7c679b8c8-ca\") pod \"cluster-proxy-proxy-agent-5874c5788b-4jfgb\" (UID: \"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" Apr 22 14:15:50.954122 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.954105 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/be5941d8-31ae-4b55-ab1f-4fe7c679b8c8-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5874c5788b-4jfgb\" (UID: \"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" Apr 22 14:15:50.959056 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.959036 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccwdw\" (UniqueName: \"kubernetes.io/projected/be5941d8-31ae-4b55-ab1f-4fe7c679b8c8-kube-api-access-ccwdw\") pod \"cluster-proxy-proxy-agent-5874c5788b-4jfgb\" (UID: \"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" Apr 22 14:15:50.959277 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.959256 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rtrk\" (UniqueName: \"kubernetes.io/projected/6731ebe9-2efe-4307-a635-926bb91326dd-kube-api-access-9rtrk\") pod \"klusterlet-addon-workmgr-7849bfffb6-66jvs\" (UID: \"6731ebe9-2efe-4307-a635-926bb91326dd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs" Apr 22 14:15:50.987400 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:50.987361 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-644884c4c7-cnxnn" Apr 22 14:15:51.025359 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:51.025328 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" Apr 22 14:15:51.057279 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:51.057242 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs" Apr 22 14:15:51.152941 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:51.152894 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:51.153124 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:51.152956 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert\") pod \"ingress-canary-f4zzk\" (UID: \"3da3d9ed-4865-4d4d-a429-13417afb99df\") " pod="openshift-ingress-canary/ingress-canary-f4zzk" Apr 22 14:15:51.153124 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:51.153088 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:51.153235 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:51.153133 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:15:51.153235 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:51.153147 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779f98969c-lk4m8: secret "image-registry-tls" not found Apr 22 14:15:51.153235 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:51.153173 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert podName:3da3d9ed-4865-4d4d-a429-13417afb99df nodeName:}" failed. No retries permitted until 2026-04-22 14:15:52.153153804 +0000 UTC m=+34.132665172 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert") pod "ingress-canary-f4zzk" (UID: "3da3d9ed-4865-4d4d-a429-13417afb99df") : secret "canary-serving-cert" not found Apr 22 14:15:51.153235 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:51.153194 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls podName:0160c8de-74a1-4adf-854c-2efcb3f7ab8e nodeName:}" failed. No retries permitted until 2026-04-22 14:15:52.153183126 +0000 UTC m=+34.132694483 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls") pod "image-registry-779f98969c-lk4m8" (UID: "0160c8de-74a1-4adf-854c-2efcb3f7ab8e") : secret "image-registry-tls" not found Apr 22 14:15:51.254238 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:51.254157 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls\") pod \"dns-default-22qt9\" (UID: \"d568b082-5592-4166-8e56-4b3f5d03022f\") " pod="openshift-dns/dns-default-22qt9" Apr 22 14:15:51.254398 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:51.254343 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:51.254477 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:51.254422 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls podName:d568b082-5592-4166-8e56-4b3f5d03022f nodeName:}" failed. No retries permitted until 2026-04-22 14:15:52.254402026 +0000 UTC m=+34.233913387 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls") pod "dns-default-22qt9" (UID: "d568b082-5592-4166-8e56-4b3f5d03022f") : secret "dns-default-metrics-tls" not found Apr 22 14:15:51.434757 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:51.433980 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs"] Apr 22 14:15:51.438232 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:51.438190 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb"] Apr 22 14:15:51.439113 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:51.439023 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-644884c4c7-cnxnn"] Apr 22 14:15:51.524271 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:51.524175 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10eb118e_4820_4012_9cec_b6bf9fd92cb6.slice/crio-dd14e094f39eae94f290b090191ba7a6e735b1ff3f6c5c7a18ea40aa27e754f5 WatchSource:0}: Error finding container dd14e094f39eae94f290b090191ba7a6e735b1ff3f6c5c7a18ea40aa27e754f5: Status 404 returned error can't find the container with id dd14e094f39eae94f290b090191ba7a6e735b1ff3f6c5c7a18ea40aa27e754f5 Apr 22 14:15:51.524614 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:51.524586 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6731ebe9_2efe_4307_a635_926bb91326dd.slice/crio-61594b31ad060ebeabb118f55c27f1f0ddec02a1bdce0da56446675de41d95fd WatchSource:0}: Error finding container 61594b31ad060ebeabb118f55c27f1f0ddec02a1bdce0da56446675de41d95fd: Status 404 returned error can't find the container with id 61594b31ad060ebeabb118f55c27f1f0ddec02a1bdce0da56446675de41d95fd Apr 22 14:15:51.525350 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:51.525329 2566 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe5941d8_31ae_4b55_ab1f_4fe7c679b8c8.slice/crio-5066b447faab2c0f8e56901efb076460baf7f070490356e514e1ae386d50ce11 WatchSource:0}: Error finding container 5066b447faab2c0f8e56901efb076460baf7f070490356e514e1ae386d50ce11: Status 404 returned error can't find the container with id 5066b447faab2c0f8e56901efb076460baf7f070490356e514e1ae386d50ce11 Apr 22 14:15:51.848490 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:51.848458 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-644884c4c7-cnxnn" event={"ID":"10eb118e-4820-4012-9cec-b6bf9fd92cb6","Type":"ContainerStarted","Data":"dd14e094f39eae94f290b090191ba7a6e735b1ff3f6c5c7a18ea40aa27e754f5"} Apr 22 14:15:51.849610 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:51.849587 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" event={"ID":"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8","Type":"ContainerStarted","Data":"5066b447faab2c0f8e56901efb076460baf7f070490356e514e1ae386d50ce11"} Apr 22 14:15:51.852381 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:51.852353 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ng46h" event={"ID":"57899f5d-95c9-4f88-8a37-538507647859","Type":"ContainerStarted","Data":"77bbbd73ea8388e8e8780357a43138c76fb7c29ab65eeab8811bbeec820b702b"} Apr 22 14:15:51.853394 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:51.853363 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs" event={"ID":"6731ebe9-2efe-4307-a635-926bb91326dd","Type":"ContainerStarted","Data":"61594b31ad060ebeabb118f55c27f1f0ddec02a1bdce0da56446675de41d95fd"} Apr 22 14:15:52.163787 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:52.163746 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:52.164078 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:52.163810 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert\") pod \"ingress-canary-f4zzk\" (UID: \"3da3d9ed-4865-4d4d-a429-13417afb99df\") " pod="openshift-ingress-canary/ingress-canary-f4zzk" Apr 22 14:15:52.164078 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:52.163979 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:52.164078 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:52.164043 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert podName:3da3d9ed-4865-4d4d-a429-13417afb99df nodeName:}" failed. No retries permitted until 2026-04-22 14:15:54.164023397 +0000 UTC m=+36.143534757 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert") pod "ingress-canary-f4zzk" (UID: "3da3d9ed-4865-4d4d-a429-13417afb99df") : secret "canary-serving-cert" not found Apr 22 14:15:52.164494 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:52.164469 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:15:52.164494 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:52.164491 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779f98969c-lk4m8: secret "image-registry-tls" not found Apr 22 14:15:52.164691 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:52.164543 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls podName:0160c8de-74a1-4adf-854c-2efcb3f7ab8e nodeName:}" failed. No retries permitted until 2026-04-22 14:15:54.164526385 +0000 UTC m=+36.144037756 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls") pod "image-registry-779f98969c-lk4m8" (UID: "0160c8de-74a1-4adf-854c-2efcb3f7ab8e") : secret "image-registry-tls" not found Apr 22 14:15:52.265133 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:52.265093 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls\") pod \"dns-default-22qt9\" (UID: \"d568b082-5592-4166-8e56-4b3f5d03022f\") " pod="openshift-dns/dns-default-22qt9" Apr 22 14:15:52.265414 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:52.265182 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs\") pod \"network-metrics-daemon-9rgrl\" (UID: \"ff0fda3b-a631-4479-bca1-451b3fd7ac2f\") " pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:15:52.265414 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:52.265320 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 14:15:52.265414 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:52.265381 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs podName:ff0fda3b-a631-4479-bca1-451b3fd7ac2f nodeName:}" failed. No retries permitted until 2026-04-22 14:16:24.265363171 +0000 UTC m=+66.244874533 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs") pod "network-metrics-daemon-9rgrl" (UID: "ff0fda3b-a631-4479-bca1-451b3fd7ac2f") : secret "metrics-daemon-secret" not found Apr 22 14:15:52.265726 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:52.265707 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:52.265791 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:52.265762 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls podName:d568b082-5592-4166-8e56-4b3f5d03022f nodeName:}" failed. 
No retries permitted until 2026-04-22 14:15:54.265746538 +0000 UTC m=+36.245257900 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls") pod "dns-default-22qt9" (UID: "d568b082-5592-4166-8e56-4b3f5d03022f") : secret "dns-default-metrics-tls" not found Apr 22 14:15:52.367023 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:52.366033 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twxx2\" (UniqueName: \"kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2\") pod \"network-check-target-w9djl\" (UID: \"ab2a1f01-aab3-488d-8a5c-09e7a9568954\") " pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:52.374327 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:52.374142 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twxx2\" (UniqueName: \"kubernetes.io/projected/ab2a1f01-aab3-488d-8a5c-09e7a9568954-kube-api-access-twxx2\") pod \"network-check-target-w9djl\" (UID: \"ab2a1f01-aab3-488d-8a5c-09e7a9568954\") " pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:52.441443 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:52.441337 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:52.627956 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:52.627866 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-w9djl"] Apr 22 14:15:52.634096 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:52.634062 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab2a1f01_aab3_488d_8a5c_09e7a9568954.slice/crio-988525e05e3ac1c97f7039b7e27c1399e11c53a5c2cbf837b975cf276ef3d045 WatchSource:0}: Error finding container 988525e05e3ac1c97f7039b7e27c1399e11c53a5c2cbf837b975cf276ef3d045: Status 404 returned error can't find the container with id 988525e05e3ac1c97f7039b7e27c1399e11c53a5c2cbf837b975cf276ef3d045 Apr 22 14:15:52.862140 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:52.862086 2566 generic.go:358] "Generic (PLEG): container finished" podID="57899f5d-95c9-4f88-8a37-538507647859" containerID="77bbbd73ea8388e8e8780357a43138c76fb7c29ab65eeab8811bbeec820b702b" exitCode=0 Apr 22 14:15:52.862318 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:52.862200 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ng46h" event={"ID":"57899f5d-95c9-4f88-8a37-538507647859","Type":"ContainerDied","Data":"77bbbd73ea8388e8e8780357a43138c76fb7c29ab65eeab8811bbeec820b702b"} Apr 22 14:15:52.865569 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:52.865520 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-w9djl" event={"ID":"ab2a1f01-aab3-488d-8a5c-09e7a9568954","Type":"ContainerStarted","Data":"988525e05e3ac1c97f7039b7e27c1399e11c53a5c2cbf837b975cf276ef3d045"} Apr 22 14:15:53.782447 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:53.782394 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret\") pod \"global-pull-secret-syncer-dghbg\" (UID: 
\"a3986444-d7dd-4409-b107-157fc81b5e02\") " pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:53.789758 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:53.789692 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a3986444-d7dd-4409-b107-157fc81b5e02-original-pull-secret\") pod \"global-pull-secret-syncer-dghbg\" (UID: \"a3986444-d7dd-4409-b107-157fc81b5e02\") " pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:53.875535 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:53.874617 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ng46h" event={"ID":"57899f5d-95c9-4f88-8a37-538507647859","Type":"ContainerStarted","Data":"8dc141a5ed551f9fea3ee8fec1d5870b356f896d4ecc935f6e12fb07417fd662"} Apr 22 14:15:53.948998 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:53.948648 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghbg" Apr 22 14:15:54.187786 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:54.187340 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:54.187786 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:54.187404 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert\") pod \"ingress-canary-f4zzk\" (UID: \"3da3d9ed-4865-4d4d-a429-13417afb99df\") " pod="openshift-ingress-canary/ingress-canary-f4zzk" Apr 22 14:15:54.187786 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:54.187618 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:54.187786 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:54.187620 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:15:54.187786 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:54.187646 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779f98969c-lk4m8: secret "image-registry-tls" not found Apr 22 14:15:54.187786 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:54.187688 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert podName:3da3d9ed-4865-4d4d-a429-13417afb99df nodeName:}" failed. No retries permitted until 2026-04-22 14:15:58.18766826 +0000 UTC m=+40.167179621 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert") pod "ingress-canary-f4zzk" (UID: "3da3d9ed-4865-4d4d-a429-13417afb99df") : secret "canary-serving-cert" not found Apr 22 14:15:54.187786 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:54.187707 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls podName:0160c8de-74a1-4adf-854c-2efcb3f7ab8e nodeName:}" failed. 
No retries permitted until 2026-04-22 14:15:58.187697656 +0000 UTC m=+40.167209015 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls") pod "image-registry-779f98969c-lk4m8" (UID: "0160c8de-74a1-4adf-854c-2efcb3f7ab8e") : secret "image-registry-tls" not found Apr 22 14:15:54.288037 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:54.287990 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls\") pod \"dns-default-22qt9\" (UID: \"d568b082-5592-4166-8e56-4b3f5d03022f\") " pod="openshift-dns/dns-default-22qt9" Apr 22 14:15:54.288231 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:54.288216 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:54.288300 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:54.288290 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls podName:d568b082-5592-4166-8e56-4b3f5d03022f nodeName:}" failed. No retries permitted until 2026-04-22 14:15:58.288270474 +0000 UTC m=+40.267781845 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls") pod "dns-default-22qt9" (UID: "d568b082-5592-4166-8e56-4b3f5d03022f") : secret "dns-default-metrics-tls" not found Apr 22 14:15:54.880098 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:54.880051 2566 generic.go:358] "Generic (PLEG): container finished" podID="57899f5d-95c9-4f88-8a37-538507647859" containerID="8dc141a5ed551f9fea3ee8fec1d5870b356f896d4ecc935f6e12fb07417fd662" exitCode=0 Apr 22 14:15:54.880561 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:54.880110 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ng46h" event={"ID":"57899f5d-95c9-4f88-8a37-538507647859","Type":"ContainerDied","Data":"8dc141a5ed551f9fea3ee8fec1d5870b356f896d4ecc935f6e12fb07417fd662"} Apr 22 14:15:58.222709 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:58.222669 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:15:58.223253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:58.222720 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert\") pod \"ingress-canary-f4zzk\" (UID: \"3da3d9ed-4865-4d4d-a429-13417afb99df\") " pod="openshift-ingress-canary/ingress-canary-f4zzk" Apr 22 14:15:58.223253 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:58.222818 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:58.223253 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:58.222824 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:15:58.223253 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:58.222844 2566 
projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779f98969c-lk4m8: secret "image-registry-tls" not found Apr 22 14:15:58.223253 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:58.222868 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert podName:3da3d9ed-4865-4d4d-a429-13417afb99df nodeName:}" failed. No retries permitted until 2026-04-22 14:16:06.222855086 +0000 UTC m=+48.202366443 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert") pod "ingress-canary-f4zzk" (UID: "3da3d9ed-4865-4d4d-a429-13417afb99df") : secret "canary-serving-cert" not found Apr 22 14:15:58.223253 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:58.222892 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls podName:0160c8de-74a1-4adf-854c-2efcb3f7ab8e nodeName:}" failed. No retries permitted until 2026-04-22 14:16:06.222875943 +0000 UTC m=+48.202387317 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls") pod "image-registry-779f98969c-lk4m8" (UID: "0160c8de-74a1-4adf-854c-2efcb3f7ab8e") : secret "image-registry-tls" not found Apr 22 14:15:58.323907 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:58.323868 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls\") pod \"dns-default-22qt9\" (UID: \"d568b082-5592-4166-8e56-4b3f5d03022f\") " pod="openshift-dns/dns-default-22qt9" Apr 22 14:15:58.324095 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:58.324063 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:58.324159 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:15:58.324151 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls podName:d568b082-5592-4166-8e56-4b3f5d03022f nodeName:}" failed. No retries permitted until 2026-04-22 14:16:06.324128964 +0000 UTC m=+48.303640346 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls") pod "dns-default-22qt9" (UID: "d568b082-5592-4166-8e56-4b3f5d03022f") : secret "dns-default-metrics-tls" not found Apr 22 14:15:58.915770 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:58.915734 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dghbg"] Apr 22 14:15:58.920169 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:15:58.920136 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3986444_d7dd_4409_b107_157fc81b5e02.slice/crio-5ce81bf82999359ff2a8e6a4f50fc583dff117f85e61822a96834c98d31c29e0 WatchSource:0}: Error finding container 5ce81bf82999359ff2a8e6a4f50fc583dff117f85e61822a96834c98d31c29e0: Status 404 returned error can't find the container with id 5ce81bf82999359ff2a8e6a4f50fc583dff117f85e61822a96834c98d31c29e0 Apr 22 14:15:59.891885 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:59.891849 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs" event={"ID":"6731ebe9-2efe-4307-a635-926bb91326dd","Type":"ContainerStarted","Data":"64c17ad11a39e326889f186a55c8aaae625f85e940eaae741a568f36167968ce"} Apr 22 14:15:59.892354 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:59.892029 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs" Apr 22 14:15:59.893710 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:59.893679 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-644884c4c7-cnxnn" event={"ID":"10eb118e-4820-4012-9cec-b6bf9fd92cb6","Type":"ContainerStarted","Data":"d3da9003a98669cc75b5e24673047e78595dd39b6620cccf2379c587cae3a8b0"} Apr 22 14:15:59.894091 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:59.894071 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs" Apr 22 14:15:59.894868 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:59.894841 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dghbg" event={"ID":"a3986444-d7dd-4409-b107-157fc81b5e02","Type":"ContainerStarted","Data":"5ce81bf82999359ff2a8e6a4f50fc583dff117f85e61822a96834c98d31c29e0"} Apr 22 14:15:59.898035 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:59.898012 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ng46h" event={"ID":"57899f5d-95c9-4f88-8a37-538507647859","Type":"ContainerStarted","Data":"4315a79fdc52953d9a005b5d0cb652016bbb696de46820c25b7f599efb9c935a"} Apr 22 14:15:59.899325 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:59.899297 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" event={"ID":"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8","Type":"ContainerStarted","Data":"664c3551f2c643acbe5a741f11cff07a06f94548ac1a1b17b61f83ec8179c675"} Apr 22 14:15:59.900560 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:59.900541 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-w9djl" 
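[Note] The nestedpendingoperations.go:348 entries above and below show the kubelet's volume manager backing off between failed MountVolume.SetUp attempts: durationBeforeRetry grows 4s, 8s, then 16s, 32s, 1m4s, and finally holds at 2m2s. A minimal Go sketch of that schedule; the doubling factor and the 2m2s ceiling are inferred from the retry intervals printed in this log, not copied from kubelet source:

package main

import (
	"fmt"
	"time"
)

// nextBackoff doubles the previous durationBeforeRetry and clamps it
// at the ceiling observed in this log (2m2s).
func nextBackoff(prev time.Duration) time.Duration {
	const maxDurationBeforeRetry = 2*time.Minute + 2*time.Second // assumed cap
	next := prev * 2
	if next > maxDurationBeforeRetry {
		next = maxDurationBeforeRetry
	}
	return next
}

func main() {
	d := 4 * time.Second // first interval seen in this log
	for i := 0; i < 6; i++ {
		fmt.Println(d) // 4s 8s 16s 32s 1m4s 2m2s
		d = nextBackoff(d)
	}
}

The backoff stops mattering once the operation succeeds; the MountVolume.SetUp succeeded entries at 14:17:58 below mark that point for these volumes.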
event={"ID":"ab2a1f01-aab3-488d-8a5c-09e7a9568954","Type":"ContainerStarted","Data":"a418436fe1a25c3f4e62f76f22ee2710fd1089153c22d455282594147afa7f08"} Apr 22 14:15:59.900703 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:59.900688 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-w9djl" Apr 22 14:15:59.913610 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:59.913556 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs" podStartSLOduration=2.666809819 podStartE2EDuration="9.913540627s" podCreationTimestamp="2026-04-22 14:15:50 +0000 UTC" firstStartedPulling="2026-04-22 14:15:51.53879907 +0000 UTC m=+33.518310431" lastFinishedPulling="2026-04-22 14:15:58.785529879 +0000 UTC m=+40.765041239" observedRunningTime="2026-04-22 14:15:59.913243835 +0000 UTC m=+41.892755214" watchObservedRunningTime="2026-04-22 14:15:59.913540627 +0000 UTC m=+41.893052008" Apr 22 14:15:59.935835 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:59.935764 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-w9djl" podStartSLOduration=35.803319921 podStartE2EDuration="41.93574565s" podCreationTimestamp="2026-04-22 14:15:18 +0000 UTC" firstStartedPulling="2026-04-22 14:15:52.638074244 +0000 UTC m=+34.617585610" lastFinishedPulling="2026-04-22 14:15:58.770499966 +0000 UTC m=+40.750011339" observedRunningTime="2026-04-22 14:15:59.934861711 +0000 UTC m=+41.914373090" watchObservedRunningTime="2026-04-22 14:15:59.93574565 +0000 UTC m=+41.915257030" Apr 22 14:15:59.966573 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:59.966516 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ng46h" podStartSLOduration=11.614789666 podStartE2EDuration="41.96649629s" podCreationTimestamp="2026-04-22 14:15:18 +0000 UTC" firstStartedPulling="2026-04-22 14:15:21.209650188 +0000 UTC m=+3.189161545" lastFinishedPulling="2026-04-22 14:15:51.561356812 +0000 UTC m=+33.540868169" observedRunningTime="2026-04-22 14:15:59.964336371 +0000 UTC m=+41.943847751" watchObservedRunningTime="2026-04-22 14:15:59.96649629 +0000 UTC m=+41.946007670" Apr 22 14:15:59.991185 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:15:59.991131 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-644884c4c7-cnxnn" podStartSLOduration=2.760718254 podStartE2EDuration="9.991109945s" podCreationTimestamp="2026-04-22 14:15:50 +0000 UTC" firstStartedPulling="2026-04-22 14:15:51.539149143 +0000 UTC m=+33.518660501" lastFinishedPulling="2026-04-22 14:15:58.76954083 +0000 UTC m=+40.749052192" observedRunningTime="2026-04-22 14:15:59.989903947 +0000 UTC m=+41.969415326" watchObservedRunningTime="2026-04-22 14:15:59.991109945 +0000 UTC m=+41.970621324" Apr 22 14:16:03.912504 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:16:03.912466 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dghbg" event={"ID":"a3986444-d7dd-4409-b107-157fc81b5e02","Type":"ContainerStarted","Data":"ce985b307f0fc17b7787ebedc6a2ee5b5fb3443f1096c8228b2474d0ccfd76e7"} Apr 22 14:16:03.928992 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:16:03.928894 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/global-pull-secret-syncer-dghbg" podStartSLOduration=38.529260276 podStartE2EDuration="42.928876618s" podCreationTimestamp="2026-04-22 14:15:21 +0000 UTC" firstStartedPulling="2026-04-22 14:15:58.921853553 +0000 UTC m=+40.901364929" lastFinishedPulling="2026-04-22 14:16:03.32146991 +0000 UTC m=+45.300981271" observedRunningTime="2026-04-22 14:16:03.928386682 +0000 UTC m=+45.907898062" watchObservedRunningTime="2026-04-22 14:16:03.928876618 +0000 UTC m=+45.908388001" Apr 22 14:16:05.920646 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:16:05.920606 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" event={"ID":"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8","Type":"ContainerStarted","Data":"959f1ccdf002ac0c0fa0fb2593b0400de522217d07e4ba8f4f8ab003eb3adce4"} Apr 22 14:16:05.921027 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:16:05.920654 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" event={"ID":"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8","Type":"ContainerStarted","Data":"145e6eb563123c10d6afb6f849dbe68da18deee2de4c6d3bda5d573343c38344"} Apr 22 14:16:05.940992 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:16:05.940941 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" podStartSLOduration=2.421436893 podStartE2EDuration="15.940927016s" podCreationTimestamp="2026-04-22 14:15:50 +0000 UTC" firstStartedPulling="2026-04-22 14:15:51.539010978 +0000 UTC m=+33.518522339" lastFinishedPulling="2026-04-22 14:16:05.058501101 +0000 UTC m=+47.038012462" observedRunningTime="2026-04-22 14:16:05.940097275 +0000 UTC m=+47.919608655" watchObservedRunningTime="2026-04-22 14:16:05.940927016 +0000 UTC m=+47.920438392" Apr 22 14:16:06.285841 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:16:06.285743 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:16:06.285841 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:16:06.285798 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert\") pod \"ingress-canary-f4zzk\" (UID: \"3da3d9ed-4865-4d4d-a429-13417afb99df\") " pod="openshift-ingress-canary/ingress-canary-f4zzk" Apr 22 14:16:06.286021 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:06.285887 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:16:06.286021 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:06.285907 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779f98969c-lk4m8: secret "image-registry-tls" not found Apr 22 14:16:06.286021 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:06.285958 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls podName:0160c8de-74a1-4adf-854c-2efcb3f7ab8e nodeName:}" failed. 
No retries permitted until 2026-04-22 14:16:22.285943576 +0000 UTC m=+64.265454949 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls") pod "image-registry-779f98969c-lk4m8" (UID: "0160c8de-74a1-4adf-854c-2efcb3f7ab8e") : secret "image-registry-tls" not found Apr 22 14:16:06.286021 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:06.285999 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:06.286149 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:06.286047 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert podName:3da3d9ed-4865-4d4d-a429-13417afb99df nodeName:}" failed. No retries permitted until 2026-04-22 14:16:22.286034165 +0000 UTC m=+64.265545524 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert") pod "ingress-canary-f4zzk" (UID: "3da3d9ed-4865-4d4d-a429-13417afb99df") : secret "canary-serving-cert" not found Apr 22 14:16:06.386755 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:16:06.386710 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls\") pod \"dns-default-22qt9\" (UID: \"d568b082-5592-4166-8e56-4b3f5d03022f\") " pod="openshift-dns/dns-default-22qt9" Apr 22 14:16:06.386883 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:06.386848 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:06.386919 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:06.386910 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls podName:d568b082-5592-4166-8e56-4b3f5d03022f nodeName:}" failed. No retries permitted until 2026-04-22 14:16:22.386894466 +0000 UTC m=+64.366405828 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls") pod "dns-default-22qt9" (UID: "d568b082-5592-4166-8e56-4b3f5d03022f") : secret "dns-default-metrics-tls" not found Apr 22 14:16:18.851612 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:16:18.851583 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-47psb" Apr 22 14:16:22.304167 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:16:22.304130 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:16:22.304581 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:16:22.304177 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert\") pod \"ingress-canary-f4zzk\" (UID: \"3da3d9ed-4865-4d4d-a429-13417afb99df\") " pod="openshift-ingress-canary/ingress-canary-f4zzk" Apr 22 14:16:22.304581 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:22.304275 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:22.304581 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:22.304281 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:16:22.304581 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:22.304299 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779f98969c-lk4m8: secret "image-registry-tls" not found Apr 22 14:16:22.304581 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:22.304336 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert podName:3da3d9ed-4865-4d4d-a429-13417afb99df nodeName:}" failed. No retries permitted until 2026-04-22 14:16:54.304322355 +0000 UTC m=+96.283833711 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert") pod "ingress-canary-f4zzk" (UID: "3da3d9ed-4865-4d4d-a429-13417afb99df") : secret "canary-serving-cert" not found Apr 22 14:16:22.304581 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:22.304354 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls podName:0160c8de-74a1-4adf-854c-2efcb3f7ab8e nodeName:}" failed. No retries permitted until 2026-04-22 14:16:54.30434107 +0000 UTC m=+96.283852428 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls") pod "image-registry-779f98969c-lk4m8" (UID: "0160c8de-74a1-4adf-854c-2efcb3f7ab8e") : secret "image-registry-tls" not found Apr 22 14:16:22.405563 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:16:22.405526 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls\") pod \"dns-default-22qt9\" (UID: \"d568b082-5592-4166-8e56-4b3f5d03022f\") " pod="openshift-dns/dns-default-22qt9" Apr 22 14:16:22.405703 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:22.405649 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:22.405746 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:22.405706 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls podName:d568b082-5592-4166-8e56-4b3f5d03022f nodeName:}" failed. No retries permitted until 2026-04-22 14:16:54.405689487 +0000 UTC m=+96.385200846 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls") pod "dns-default-22qt9" (UID: "d568b082-5592-4166-8e56-4b3f5d03022f") : secret "dns-default-metrics-tls" not found Apr 22 14:16:24.320538 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:16:24.320493 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs\") pod \"network-metrics-daemon-9rgrl\" (UID: \"ff0fda3b-a631-4479-bca1-451b3fd7ac2f\") " pod="openshift-multus/network-metrics-daemon-9rgrl" Apr 22 14:16:24.320916 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:24.320609 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 14:16:24.320916 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:24.320665 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs podName:ff0fda3b-a631-4479-bca1-451b3fd7ac2f nodeName:}" failed. No retries permitted until 2026-04-22 14:17:28.32065043 +0000 UTC m=+130.300161788 (durationBeforeRetry 1m4s). 
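[Note] Every "No retries permitted until" deadline carries two readings, a wall-clock time plus a monotonic offset such as m=+64.265454949: Go's time.Time.String() appends the monotonic-clock reading for values produced by time.Now(), counted from process start (this kubenswrapper process started around 14:15:18, and 14:16:22 is indeed 64s later). The monotonic reading is what makes the deadline comparisons immune to wall-clock steps. A tiny illustration:

package main

import (
	"fmt"
	"time"
)

func main() {
	t := time.Now()
	// String() on a time.Now() value appends the monotonic reading,
	// e.g. "2026-04-22 14:16:22.285943576 +0000 UTC m=+64.265454949".
	fmt.Println(t)
	// Round(0) strips the monotonic reading; only the wall clock remains.
	fmt.Println(t.Round(0))
}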
Apr 22 14:16:24.320916 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:24.320665 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs podName:ff0fda3b-a631-4479-bca1-451b3fd7ac2f nodeName:}" failed. No retries permitted until 2026-04-22 14:17:28.32065043 +0000 UTC m=+130.300161788 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs") pod "network-metrics-daemon-9rgrl" (UID: "ff0fda3b-a631-4479-bca1-451b3fd7ac2f") : secret "metrics-daemon-secret" not found
Apr 22 14:16:30.905090 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:16:30.905054 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-w9djl"
Apr 22 14:16:54.343527 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:16:54.343343 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8"
Apr 22 14:16:54.343527 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:16:54.343405 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert\") pod \"ingress-canary-f4zzk\" (UID: \"3da3d9ed-4865-4d4d-a429-13417afb99df\") " pod="openshift-ingress-canary/ingress-canary-f4zzk"
Apr 22 14:16:54.344091 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:54.343539 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 14:16:54.344091 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:54.343568 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-779f98969c-lk4m8: secret "image-registry-tls" not found
Apr 22 14:16:54.344091 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:54.343634 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls podName:0160c8de-74a1-4adf-854c-2efcb3f7ab8e nodeName:}" failed. No retries permitted until 2026-04-22 14:17:58.343610719 +0000 UTC m=+160.323122082 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls") pod "image-registry-779f98969c-lk4m8" (UID: "0160c8de-74a1-4adf-854c-2efcb3f7ab8e") : secret "image-registry-tls" not found
Apr 22 14:16:54.344091 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:54.343548 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:16:54.344091 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:54.343691 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert podName:3da3d9ed-4865-4d4d-a429-13417afb99df nodeName:}" failed. No retries permitted until 2026-04-22 14:17:58.343676122 +0000 UTC m=+160.323187492 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert") pod "ingress-canary-f4zzk" (UID: "3da3d9ed-4865-4d4d-a429-13417afb99df") : secret "canary-serving-cert" not found
Apr 22 14:16:54.443897 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:16:54.443844 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls\") pod \"dns-default-22qt9\" (UID: \"d568b082-5592-4166-8e56-4b3f5d03022f\") " pod="openshift-dns/dns-default-22qt9"
Apr 22 14:16:54.444060 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:54.443984 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:16:54.444060 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:16:54.444047 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls podName:d568b082-5592-4166-8e56-4b3f5d03022f nodeName:}" failed. No retries permitted until 2026-04-22 14:17:58.444031054 +0000 UTC m=+160.423542411 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls") pod "dns-default-22qt9" (UID: "d568b082-5592-4166-8e56-4b3f5d03022f") : secret "dns-default-metrics-tls" not found
Apr 22 14:17:28.393449 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:17:28.393383 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs\") pod \"network-metrics-daemon-9rgrl\" (UID: \"ff0fda3b-a631-4479-bca1-451b3fd7ac2f\") " pod="openshift-multus/network-metrics-daemon-9rgrl"
Apr 22 14:17:28.393923 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:17:28.393537 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 14:17:28.393923 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:17:28.393630 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs podName:ff0fda3b-a631-4479-bca1-451b3fd7ac2f nodeName:}" failed. No retries permitted until 2026-04-22 14:19:30.393611804 +0000 UTC m=+252.373123162 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs") pod "network-metrics-daemon-9rgrl" (UID: "ff0fda3b-a631-4479-bca1-451b3fd7ac2f") : secret "metrics-daemon-secret" not found
Apr 22 14:17:46.356536 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:17:46.356510 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6rsgf_033bc69f-f51f-40ca-8484-1ae2dc580b53/dns-node-resolver/0.log"
Apr 22 14:17:47.752613 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:17:47.752586 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zmgl9_1c429a6e-0682-4fe6-9ec0-b39e350ccc63/node-ca/0.log"
Apr 22 14:17:53.468015 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:17:53.467968 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-779f98969c-lk4m8" podUID="0160c8de-74a1-4adf-854c-2efcb3f7ab8e"
Apr 22 14:17:53.480132 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:17:53.480108 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-f4zzk" podUID="3da3d9ed-4865-4d4d-a429-13417afb99df"
Apr 22 14:17:53.507376 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:17:53.507327 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-22qt9" podUID="d568b082-5592-4166-8e56-4b3f5d03022f"
Apr 22 14:17:53.671513 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:17:53.671472 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-9rgrl" podUID="ff0fda3b-a631-4479-bca1-451b3fd7ac2f"
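[Note] The four "Error syncing pod, skipping" entries above are where pod sync stopped waiting for the still-missing secrets: the kubelet waits for all of a pod's volumes to mount under a context with a deadline (on the order of two minutes), and when the deadline passes the wait surfaces context deadline exceeded and the pod worker requeues the pod. A minimal sketch of that wait-with-deadline pattern; waitForMounts and the timeouts below are illustrative, not kubelet's actual code:

package main

import (
	"context"
	"errors"
	"fmt"
	"time"
)

// waitForMounts polls ready() until it reports true or ctx expires,
// mirroring how a sync step surfaces "context deadline exceeded".
func waitForMounts(ctx context.Context, ready func() bool) error {
	ticker := time.NewTicker(100 * time.Millisecond)
	defer ticker.Stop()
	for {
		if ready() {
			return nil
		}
		select {
		case <-ctx.Done():
			return fmt.Errorf("unmounted volumes remain: %w", ctx.Err())
		case <-ticker.C:
		}
	}
}

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 500*time.Millisecond)
	defer cancel()
	err := waitForMounts(ctx, func() bool { return false }) // secret never appears
	fmt.Println(err, errors.Is(err, context.DeadlineExceeded)) // ... true
}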
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-f4zzk" Apr 22 14:17:58.420588 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:17:58.420528 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:17:58.420588 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:17:58.420596 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert\") pod \"ingress-canary-f4zzk\" (UID: \"3da3d9ed-4865-4d4d-a429-13417afb99df\") " pod="openshift-ingress-canary/ingress-canary-f4zzk" Apr 22 14:17:58.423165 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:17:58.423141 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3da3d9ed-4865-4d4d-a429-13417afb99df-cert\") pod \"ingress-canary-f4zzk\" (UID: \"3da3d9ed-4865-4d4d-a429-13417afb99df\") " pod="openshift-ingress-canary/ingress-canary-f4zzk" Apr 22 14:17:58.423272 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:17:58.423252 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls\") pod \"image-registry-779f98969c-lk4m8\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") " pod="openshift-image-registry/image-registry-779f98969c-lk4m8" Apr 22 14:17:58.521347 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:17:58.521315 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls\") pod \"dns-default-22qt9\" (UID: \"d568b082-5592-4166-8e56-4b3f5d03022f\") " pod="openshift-dns/dns-default-22qt9" Apr 22 14:17:58.523597 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:17:58.523578 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d568b082-5592-4166-8e56-4b3f5d03022f-metrics-tls\") pod \"dns-default-22qt9\" (UID: \"d568b082-5592-4166-8e56-4b3f5d03022f\") " pod="openshift-dns/dns-default-22qt9" Apr 22 14:17:58.675772 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:17:58.675689 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-slf7l\"" Apr 22 14:17:58.683110 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:17:58.683085 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-f4zzk" Apr 22 14:17:58.799928 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:17:58.799898 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-f4zzk"] Apr 22 14:17:58.802837 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:17:58.802804 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3da3d9ed_4865_4d4d_a429_13417afb99df.slice/crio-b48db4ec67c8fcf6aa6a4a9a636da0a6ea60ab1783fdece7b28c34882daba724 WatchSource:0}: Error finding container b48db4ec67c8fcf6aa6a4a9a636da0a6ea60ab1783fdece7b28c34882daba724: Status 404 returned error can't find the container with id b48db4ec67c8fcf6aa6a4a9a636da0a6ea60ab1783fdece7b28c34882daba724 Apr 22 14:17:59.186051 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:17:59.186016 2566 generic.go:358] "Generic (PLEG): container finished" podID="6731ebe9-2efe-4307-a635-926bb91326dd" containerID="64c17ad11a39e326889f186a55c8aaae625f85e940eaae741a568f36167968ce" exitCode=1 Apr 22 14:17:59.186308 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:17:59.186094 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs" event={"ID":"6731ebe9-2efe-4307-a635-926bb91326dd","Type":"ContainerDied","Data":"64c17ad11a39e326889f186a55c8aaae625f85e940eaae741a568f36167968ce"} Apr 22 14:17:59.186500 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:17:59.186484 2566 scope.go:117] "RemoveContainer" containerID="64c17ad11a39e326889f186a55c8aaae625f85e940eaae741a568f36167968ce" Apr 22 14:17:59.187301 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:17:59.187279 2566 generic.go:358] "Generic (PLEG): container finished" podID="10eb118e-4820-4012-9cec-b6bf9fd92cb6" containerID="d3da9003a98669cc75b5e24673047e78595dd39b6620cccf2379c587cae3a8b0" exitCode=255 Apr 22 14:17:59.187379 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:17:59.187342 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-644884c4c7-cnxnn" event={"ID":"10eb118e-4820-4012-9cec-b6bf9fd92cb6","Type":"ContainerDied","Data":"d3da9003a98669cc75b5e24673047e78595dd39b6620cccf2379c587cae3a8b0"} Apr 22 14:17:59.187650 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:17:59.187631 2566 scope.go:117] "RemoveContainer" containerID="d3da9003a98669cc75b5e24673047e78595dd39b6620cccf2379c587cae3a8b0" Apr 22 14:17:59.188490 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:17:59.188467 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-f4zzk" event={"ID":"3da3d9ed-4865-4d4d-a429-13417afb99df","Type":"ContainerStarted","Data":"b48db4ec67c8fcf6aa6a4a9a636da0a6ea60ab1783fdece7b28c34882daba724"} Apr 22 14:17:59.892814 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:17:59.892772 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs" Apr 22 14:18:00.192911 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:00.192822 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs" event={"ID":"6731ebe9-2efe-4307-a635-926bb91326dd","Type":"ContainerStarted","Data":"3f3a734d3ac5c5e360f074e4fb97247efba6cc445e4d474d551e57dc83d6dc4b"} Apr 22 14:18:00.193097 ip-10-0-129-161 kubenswrapper[2566]: I0422 
Apr 22 14:18:00.193097 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:00.193027 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs"
Apr 22 14:18:00.193776 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:00.193753 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7849bfffb6-66jvs"
Apr 22 14:18:00.194575 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:00.194551 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-644884c4c7-cnxnn" event={"ID":"10eb118e-4820-4012-9cec-b6bf9fd92cb6","Type":"ContainerStarted","Data":"c094a7f386df15641db757b9d7cfe349c84c49ceebcf493af93c8d57d07a0c19"}
Apr 22 14:18:01.198095 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:01.198053 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-f4zzk" event={"ID":"3da3d9ed-4865-4d4d-a429-13417afb99df","Type":"ContainerStarted","Data":"48b27a80445caa6a83388884c980da6f86b1c47e837a14a99953f9a27bf2404d"}
Apr 22 14:18:01.219583 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:01.219535 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-f4zzk" podStartSLOduration=129.652226134 podStartE2EDuration="2m11.219520227s" podCreationTimestamp="2026-04-22 14:15:50 +0000 UTC" firstStartedPulling="2026-04-22 14:17:58.804553045 +0000 UTC m=+160.784064402" lastFinishedPulling="2026-04-22 14:18:00.371847128 +0000 UTC m=+162.351358495" observedRunningTime="2026-04-22 14:18:01.218828376 +0000 UTC m=+163.198339754" watchObservedRunningTime="2026-04-22 14:18:01.219520227 +0000 UTC m=+163.199031607"
Apr 22 14:18:03.629409 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:03.629376 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-779f98969c-lk4m8"
Apr 22 14:18:03.632420 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:03.632392 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xbchn\""
Apr 22 14:18:03.639853 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:03.639834 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-779f98969c-lk4m8"
Apr 22 14:18:03.761933 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:03.761902 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-779f98969c-lk4m8"]
Apr 22 14:18:03.764903 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:18:03.764880 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0160c8de_74a1_4adf_854c_2efcb3f7ab8e.slice/crio-378745f5fd1f758b98b6cba782dcda9101441f3c554501cd87344987fcd13112 WatchSource:0}: Error finding container 378745f5fd1f758b98b6cba782dcda9101441f3c554501cd87344987fcd13112: Status 404 returned error can't find the container with id 378745f5fd1f758b98b6cba782dcda9101441f3c554501cd87344987fcd13112
Apr 22 14:18:04.208714 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:04.208669 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-779f98969c-lk4m8" event={"ID":"0160c8de-74a1-4adf-854c-2efcb3f7ab8e","Type":"ContainerStarted","Data":"5befc7fb99adfdd552d0e9d22a3f97492a60cb5c7885dc15ac5ea7c80a7ea998"}
Apr 22 14:18:04.208714 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:04.208709 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-779f98969c-lk4m8" event={"ID":"0160c8de-74a1-4adf-854c-2efcb3f7ab8e","Type":"ContainerStarted","Data":"378745f5fd1f758b98b6cba782dcda9101441f3c554501cd87344987fcd13112"}
Apr 22 14:18:04.208950 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:04.208793 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-779f98969c-lk4m8"
Apr 22 14:18:04.233633 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:04.233585 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-779f98969c-lk4m8" podStartSLOduration=165.233570924 podStartE2EDuration="2m45.233570924s" podCreationTimestamp="2026-04-22 14:15:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:18:04.232391871 +0000 UTC m=+166.211903251" watchObservedRunningTime="2026-04-22 14:18:04.233570924 +0000 UTC m=+166.213082302"
Apr 22 14:18:06.629670 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:06.629633 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9rgrl"
Apr 22 14:18:07.629784 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:07.629691 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-22qt9"
Apr 22 14:18:07.632746 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:07.632722 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fpnmw\""
Need to start a new one" pod="openshift-dns/dns-default-22qt9" Apr 22 14:18:07.767141 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:07.767055 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-22qt9"] Apr 22 14:18:07.769589 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:18:07.769564 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd568b082_5592_4166_8e56_4b3f5d03022f.slice/crio-6316407f90edddce4a4096f046bed68f3dcb54a4ef008969e90a42f68d74f6cb WatchSource:0}: Error finding container 6316407f90edddce4a4096f046bed68f3dcb54a4ef008969e90a42f68d74f6cb: Status 404 returned error can't find the container with id 6316407f90edddce4a4096f046bed68f3dcb54a4ef008969e90a42f68d74f6cb Apr 22 14:18:08.218776 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:08.218743 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-22qt9" event={"ID":"d568b082-5592-4166-8e56-4b3f5d03022f","Type":"ContainerStarted","Data":"6316407f90edddce4a4096f046bed68f3dcb54a4ef008969e90a42f68d74f6cb"} Apr 22 14:18:09.223018 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:09.222983 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-22qt9" event={"ID":"d568b082-5592-4166-8e56-4b3f5d03022f","Type":"ContainerStarted","Data":"2c8d4ae3c362236192ebec7c5d984ca7b4fa0f66692b838cfbb32be7d4d5e1ca"} Apr 22 14:18:10.226943 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:10.226910 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-22qt9" event={"ID":"d568b082-5592-4166-8e56-4b3f5d03022f","Type":"ContainerStarted","Data":"dd958e721f8c4b55283e207a1fe7edc916ab5cc1fa09bde51bcc85a0a2868c85"} Apr 22 14:18:10.227333 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:10.227066 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-22qt9" Apr 22 14:18:10.244928 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:10.244881 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-22qt9" podStartSLOduration=138.931364426 podStartE2EDuration="2m20.244867002s" podCreationTimestamp="2026-04-22 14:15:50 +0000 UTC" firstStartedPulling="2026-04-22 14:18:07.771675581 +0000 UTC m=+169.751186939" lastFinishedPulling="2026-04-22 14:18:09.085178159 +0000 UTC m=+171.064689515" observedRunningTime="2026-04-22 14:18:10.243125767 +0000 UTC m=+172.222637146" watchObservedRunningTime="2026-04-22 14:18:10.244867002 +0000 UTC m=+172.224378381" Apr 22 14:18:10.987655 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:10.987621 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-ctlbm"] Apr 22 14:18:10.990916 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:10.990895 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ctlbm" Apr 22 14:18:10.995091 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:10.995068 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 14:18:10.995223 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:10.995068 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 14:18:10.996305 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:10.996288 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 14:18:10.996417 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:10.996359 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 14:18:10.996501 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:10.996414 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-7qqgh\"" Apr 22 14:18:11.019860 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:11.019833 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ctlbm"] Apr 22 14:18:11.115201 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:11.115172 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/93431b3f-da56-4a76-93a2-98e25d7809a5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ctlbm\" (UID: \"93431b3f-da56-4a76-93a2-98e25d7809a5\") " pod="openshift-insights/insights-runtime-extractor-ctlbm" Apr 22 14:18:11.115358 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:11.115246 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/93431b3f-da56-4a76-93a2-98e25d7809a5-crio-socket\") pod \"insights-runtime-extractor-ctlbm\" (UID: \"93431b3f-da56-4a76-93a2-98e25d7809a5\") " pod="openshift-insights/insights-runtime-extractor-ctlbm" Apr 22 14:18:11.115358 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:11.115280 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/93431b3f-da56-4a76-93a2-98e25d7809a5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ctlbm\" (UID: \"93431b3f-da56-4a76-93a2-98e25d7809a5\") " pod="openshift-insights/insights-runtime-extractor-ctlbm" Apr 22 14:18:11.115358 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:11.115338 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gdqn\" (UniqueName: \"kubernetes.io/projected/93431b3f-da56-4a76-93a2-98e25d7809a5-kube-api-access-6gdqn\") pod \"insights-runtime-extractor-ctlbm\" (UID: \"93431b3f-da56-4a76-93a2-98e25d7809a5\") " pod="openshift-insights/insights-runtime-extractor-ctlbm" Apr 22 14:18:11.115504 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:11.115361 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/93431b3f-da56-4a76-93a2-98e25d7809a5-data-volume\") pod \"insights-runtime-extractor-ctlbm\" (UID: 
\"93431b3f-da56-4a76-93a2-98e25d7809a5\") " pod="openshift-insights/insights-runtime-extractor-ctlbm" Apr 22 14:18:11.216172 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:11.216141 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/93431b3f-da56-4a76-93a2-98e25d7809a5-data-volume\") pod \"insights-runtime-extractor-ctlbm\" (UID: \"93431b3f-da56-4a76-93a2-98e25d7809a5\") " pod="openshift-insights/insights-runtime-extractor-ctlbm" Apr 22 14:18:11.216172 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:11.216174 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/93431b3f-da56-4a76-93a2-98e25d7809a5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ctlbm\" (UID: \"93431b3f-da56-4a76-93a2-98e25d7809a5\") " pod="openshift-insights/insights-runtime-extractor-ctlbm" Apr 22 14:18:11.216411 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:11.216215 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/93431b3f-da56-4a76-93a2-98e25d7809a5-crio-socket\") pod \"insights-runtime-extractor-ctlbm\" (UID: \"93431b3f-da56-4a76-93a2-98e25d7809a5\") " pod="openshift-insights/insights-runtime-extractor-ctlbm" Apr 22 14:18:11.216411 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:11.216242 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/93431b3f-da56-4a76-93a2-98e25d7809a5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ctlbm\" (UID: \"93431b3f-da56-4a76-93a2-98e25d7809a5\") " pod="openshift-insights/insights-runtime-extractor-ctlbm" Apr 22 14:18:11.216411 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:11.216267 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gdqn\" (UniqueName: \"kubernetes.io/projected/93431b3f-da56-4a76-93a2-98e25d7809a5-kube-api-access-6gdqn\") pod \"insights-runtime-extractor-ctlbm\" (UID: \"93431b3f-da56-4a76-93a2-98e25d7809a5\") " pod="openshift-insights/insights-runtime-extractor-ctlbm" Apr 22 14:18:11.216411 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:11.216352 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/93431b3f-da56-4a76-93a2-98e25d7809a5-crio-socket\") pod \"insights-runtime-extractor-ctlbm\" (UID: \"93431b3f-da56-4a76-93a2-98e25d7809a5\") " pod="openshift-insights/insights-runtime-extractor-ctlbm" Apr 22 14:18:11.216644 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:11.216587 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/93431b3f-da56-4a76-93a2-98e25d7809a5-data-volume\") pod \"insights-runtime-extractor-ctlbm\" (UID: \"93431b3f-da56-4a76-93a2-98e25d7809a5\") " pod="openshift-insights/insights-runtime-extractor-ctlbm" Apr 22 14:18:11.216818 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:11.216796 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/93431b3f-da56-4a76-93a2-98e25d7809a5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ctlbm\" (UID: \"93431b3f-da56-4a76-93a2-98e25d7809a5\") " pod="openshift-insights/insights-runtime-extractor-ctlbm" Apr 22 14:18:11.218478 ip-10-0-129-161 
kubenswrapper[2566]: I0422 14:18:11.218454 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/93431b3f-da56-4a76-93a2-98e25d7809a5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ctlbm\" (UID: \"93431b3f-da56-4a76-93a2-98e25d7809a5\") " pod="openshift-insights/insights-runtime-extractor-ctlbm"
Apr 22 14:18:11.225898 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:11.225880 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gdqn\" (UniqueName: \"kubernetes.io/projected/93431b3f-da56-4a76-93a2-98e25d7809a5-kube-api-access-6gdqn\") pod \"insights-runtime-extractor-ctlbm\" (UID: \"93431b3f-da56-4a76-93a2-98e25d7809a5\") " pod="openshift-insights/insights-runtime-extractor-ctlbm"
Apr 22 14:18:11.299425 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:11.299341 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ctlbm"
Apr 22 14:18:11.431167 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:11.429263 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ctlbm"]
Apr 22 14:18:11.434640 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:18:11.434610 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93431b3f_da56_4a76_93a2_98e25d7809a5.slice/crio-56d1c5b8bf29fe1b1c17bffbbbba6ee0431c3d48b121ba0e8e0c5d13fb815c96 WatchSource:0}: Error finding container 56d1c5b8bf29fe1b1c17bffbbbba6ee0431c3d48b121ba0e8e0c5d13fb815c96: Status 404 returned error can't find the container with id 56d1c5b8bf29fe1b1c17bffbbbba6ee0431c3d48b121ba0e8e0c5d13fb815c96
Apr 22 14:18:12.234296 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:12.234257 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ctlbm" event={"ID":"93431b3f-da56-4a76-93a2-98e25d7809a5","Type":"ContainerStarted","Data":"4242ce6c02a8be20cba50df552f899733ccd52a5102cbf237daf51ecdadff3b6"}
Apr 22 14:18:12.234296 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:12.234293 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ctlbm" event={"ID":"93431b3f-da56-4a76-93a2-98e25d7809a5","Type":"ContainerStarted","Data":"302fb0bcf1b65421d254763fb8c577afc44742e5909d6a77c8c057992cbbcc7e"}
Apr 22 14:18:12.234447 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:12.234302 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ctlbm" event={"ID":"93431b3f-da56-4a76-93a2-98e25d7809a5","Type":"ContainerStarted","Data":"56d1c5b8bf29fe1b1c17bffbbbba6ee0431c3d48b121ba0e8e0c5d13fb815c96"}
Apr 22 14:18:14.241175 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:14.241141 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ctlbm" event={"ID":"93431b3f-da56-4a76-93a2-98e25d7809a5","Type":"ContainerStarted","Data":"f9f70e8c60fd695cd8232bd086a5720a13d1db3e8605c2a9cb33c87061ee3a0c"}
Apr 22 14:18:14.265161 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:14.265114 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-ctlbm" podStartSLOduration=2.355904616 podStartE2EDuration="4.265086386s" podCreationTimestamp="2026-04-22 14:18:10 +0000 UTC" firstStartedPulling="2026-04-22 14:18:11.486093153 +0000 UTC m=+173.465604513" lastFinishedPulling="2026-04-22 14:18:13.395274923 +0000 UTC m=+175.374786283" observedRunningTime="2026-04-22 14:18:14.26455679 +0000 UTC m=+176.244068168" watchObservedRunningTime="2026-04-22 14:18:14.265086386 +0000 UTC m=+176.244597764"
Apr 22 14:18:20.233007 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:20.232977 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-22qt9"
Apr 22 14:18:22.014520 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.014492 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4sg67"]
Apr 22 14:18:22.017753 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.017737 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.020711 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.020682 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 14:18:22.020839 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.020706 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 14:18:22.020839 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.020798 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 14:18:22.021493 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.021476 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wnzx7\""
Apr 22 14:18:22.021617 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.021603 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 14:18:22.022065 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.022051 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 14:18:22.022115 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.022092 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 14:18:22.100358 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.100319 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2wjg\" (UniqueName: \"kubernetes.io/projected/7a7698ea-e37f-4471-b0da-79820ee5c2ef-kube-api-access-c2wjg\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.100563 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.100375 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7a7698ea-e37f-4471-b0da-79820ee5c2ef-node-exporter-accelerators-collector-config\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.100563 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.100426 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7a7698ea-e37f-4471-b0da-79820ee5c2ef-root\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.100563 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.100471 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7a7698ea-e37f-4471-b0da-79820ee5c2ef-node-exporter-wtmp\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.100563 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.100496 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7a7698ea-e37f-4471-b0da-79820ee5c2ef-node-exporter-textfile\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.100563 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.100519 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7a7698ea-e37f-4471-b0da-79820ee5c2ef-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.100749 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.100574 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a7698ea-e37f-4471-b0da-79820ee5c2ef-metrics-client-ca\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.100749 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.100637 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7a7698ea-e37f-4471-b0da-79820ee5c2ef-node-exporter-tls\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.100749 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.100673 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a7698ea-e37f-4471-b0da-79820ee5c2ef-sys\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.201668 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.201634 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7a7698ea-e37f-4471-b0da-79820ee5c2ef-node-exporter-tls\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.201827 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.201679 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a7698ea-e37f-4471-b0da-79820ee5c2ef-sys\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.201827 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.201697 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2wjg\" (UniqueName: \"kubernetes.io/projected/7a7698ea-e37f-4471-b0da-79820ee5c2ef-kube-api-access-c2wjg\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.201827 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.201743 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a7698ea-e37f-4471-b0da-79820ee5c2ef-sys\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.201827 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.201781 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7a7698ea-e37f-4471-b0da-79820ee5c2ef-node-exporter-accelerators-collector-config\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.201827 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.201806 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7a7698ea-e37f-4471-b0da-79820ee5c2ef-root\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.201827 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:18:22.201812 2566 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 14:18:22.202121 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.201833 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7a7698ea-e37f-4471-b0da-79820ee5c2ef-node-exporter-wtmp\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.202121 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.201866 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7a7698ea-e37f-4471-b0da-79820ee5c2ef-node-exporter-textfile\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.202121 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:18:22.201901 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a7698ea-e37f-4471-b0da-79820ee5c2ef-node-exporter-tls podName:7a7698ea-e37f-4471-b0da-79820ee5c2ef nodeName:}" failed. No retries permitted until 2026-04-22 14:18:22.701879511 +0000 UTC m=+184.681390885 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/7a7698ea-e37f-4471-b0da-79820ee5c2ef-node-exporter-tls") pod "node-exporter-4sg67" (UID: "7a7698ea-e37f-4471-b0da-79820ee5c2ef") : secret "node-exporter-tls" not found
Apr 22 14:18:22.202121 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.201913 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7a7698ea-e37f-4471-b0da-79820ee5c2ef-root\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.202121 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.201948 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7a7698ea-e37f-4471-b0da-79820ee5c2ef-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.202121 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.201982 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a7698ea-e37f-4471-b0da-79820ee5c2ef-metrics-client-ca\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.202121 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.202010 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7a7698ea-e37f-4471-b0da-79820ee5c2ef-node-exporter-wtmp\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.202501 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.202411 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7a7698ea-e37f-4471-b0da-79820ee5c2ef-node-exporter-accelerators-collector-config\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.202603 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.202587 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7a7698ea-e37f-4471-b0da-79820ee5c2ef-node-exporter-textfile\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.203018 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.202999 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a7698ea-e37f-4471-b0da-79820ee5c2ef-metrics-client-ca\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.204132 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.204105 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7a7698ea-e37f-4471-b0da-79820ee5c2ef-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.210849 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.210823 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2wjg\" (UniqueName: \"kubernetes.io/projected/7a7698ea-e37f-4471-b0da-79820ee5c2ef-kube-api-access-c2wjg\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.706338 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.706298 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7a7698ea-e37f-4471-b0da-79820ee5c2ef-node-exporter-tls\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.708720 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.708694 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7a7698ea-e37f-4471-b0da-79820ee5c2ef-node-exporter-tls\") pod \"node-exporter-4sg67\" (UID: \"7a7698ea-e37f-4471-b0da-79820ee5c2ef\") " pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.926579 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:22.926547 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4sg67"
Apr 22 14:18:22.934791 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:18:22.934761 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a7698ea_e37f_4471_b0da_79820ee5c2ef.slice/crio-b76859dd5246f58a5c6aa3b3efa89202c7a79dd6e6f42a7fd41590c3feebce16 WatchSource:0}: Error finding container b76859dd5246f58a5c6aa3b3efa89202c7a79dd6e6f42a7fd41590c3feebce16: Status 404 returned error can't find the container with id b76859dd5246f58a5c6aa3b3efa89202c7a79dd6e6f42a7fd41590c3feebce16
Apr 22 14:18:23.266000 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:23.265960 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4sg67" event={"ID":"7a7698ea-e37f-4471-b0da-79820ee5c2ef","Type":"ContainerStarted","Data":"b76859dd5246f58a5c6aa3b3efa89202c7a79dd6e6f42a7fd41590c3feebce16"}
Apr 22 14:18:24.270328 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:24.270286 2566 generic.go:358] "Generic (PLEG): container finished" podID="7a7698ea-e37f-4471-b0da-79820ee5c2ef" containerID="f46b07b257c0440aff830ee8fc9a6370d9db9cea687f0d20281904c09f33add2" exitCode=0
Apr 22 14:18:24.270728 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:24.270348 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4sg67" event={"ID":"7a7698ea-e37f-4471-b0da-79820ee5c2ef","Type":"ContainerDied","Data":"f46b07b257c0440aff830ee8fc9a6370d9db9cea687f0d20281904c09f33add2"}
Apr 22 14:18:25.215027 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:25.214994 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-779f98969c-lk4m8"
Apr 22 14:18:25.275592 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:25.275563 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4sg67" event={"ID":"7a7698ea-e37f-4471-b0da-79820ee5c2ef","Type":"ContainerStarted","Data":"95d632520c221bfbc645e39f41944f7d0341e1485c788d140b9a14cc702148f3"}
Apr 22 14:18:25.275592 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:25.275596 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4sg67" event={"ID":"7a7698ea-e37f-4471-b0da-79820ee5c2ef","Type":"ContainerStarted","Data":"43980607be1ccc0adfc4f71bfc7bcccadc0e73c7a4d30a78e31a24908248dcaf"}
Apr 22 14:18:25.300986 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:25.300926 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4sg67" podStartSLOduration=3.572746737 podStartE2EDuration="4.300908275s" podCreationTimestamp="2026-04-22 14:18:21 +0000 UTC" firstStartedPulling="2026-04-22 14:18:22.936593603 +0000 UTC m=+184.916104959" lastFinishedPulling="2026-04-22 14:18:23.664755126 +0000 UTC m=+185.644266497" observedRunningTime="2026-04-22 14:18:25.300595597 +0000 UTC m=+187.280106976" watchObservedRunningTime="2026-04-22 14:18:25.300908275 +0000 UTC m=+187.280419657"
Apr 22 14:18:33.151721 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:33.151683 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-779f98969c-lk4m8"]
Apr 22 14:18:51.026465 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:51.026389 2566 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" podUID="be5941d8-31ae-4b55-ab1f-4fe7c679b8c8" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 22 14:18:58.169737 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.169670 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-779f98969c-lk4m8" podUID="0160c8de-74a1-4adf-854c-2efcb3f7ab8e" containerName="registry" containerID="cri-o://5befc7fb99adfdd552d0e9d22a3f97492a60cb5c7885dc15ac5ea7c80a7ea998" gracePeriod=30
Apr 22 14:18:58.358988 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.358958 2566 generic.go:358] "Generic (PLEG): container finished" podID="0160c8de-74a1-4adf-854c-2efcb3f7ab8e" containerID="5befc7fb99adfdd552d0e9d22a3f97492a60cb5c7885dc15ac5ea7c80a7ea998" exitCode=0
Apr 22 14:18:58.359117 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.359026 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-779f98969c-lk4m8" event={"ID":"0160c8de-74a1-4adf-854c-2efcb3f7ab8e","Type":"ContainerDied","Data":"5befc7fb99adfdd552d0e9d22a3f97492a60cb5c7885dc15ac5ea7c80a7ea998"}
Apr 22 14:18:58.409032 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.409008 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-779f98969c-lk4m8"
Apr 22 14:18:58.468754 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.468674 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-ca-trust-extracted\") pod \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") "
Apr 22 14:18:58.468754 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.468718 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-trusted-ca\") pod \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") "
Apr 22 14:18:58.468754 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.468742 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-bound-sa-token\") pod \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") "
Apr 22 14:18:58.469003 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.468768 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cld6j\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-kube-api-access-cld6j\") pod \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") "
Apr 22 14:18:58.469003 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.468808 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-certificates\") pod \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") "
Apr 22 14:18:58.469003 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.468838 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-installation-pull-secrets\") pod \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") "
Apr 22 14:18:58.469003 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.468869 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls\") pod \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") "
Apr 22 14:18:58.469003 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.468911 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-image-registry-private-configuration\") pod \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\" (UID: \"0160c8de-74a1-4adf-854c-2efcb3f7ab8e\") "
Apr 22 14:18:58.469245 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.469169 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0160c8de-74a1-4adf-854c-2efcb3f7ab8e" (UID: "0160c8de-74a1-4adf-854c-2efcb3f7ab8e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:18:58.469360 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.469323 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0160c8de-74a1-4adf-854c-2efcb3f7ab8e" (UID: "0160c8de-74a1-4adf-854c-2efcb3f7ab8e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:18:58.471493 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.471446 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0160c8de-74a1-4adf-854c-2efcb3f7ab8e" (UID: "0160c8de-74a1-4adf-854c-2efcb3f7ab8e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:18:58.471493 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.471472 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0160c8de-74a1-4adf-854c-2efcb3f7ab8e" (UID: "0160c8de-74a1-4adf-854c-2efcb3f7ab8e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:18:58.471493 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.471457 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0160c8de-74a1-4adf-854c-2efcb3f7ab8e" (UID: "0160c8de-74a1-4adf-854c-2efcb3f7ab8e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:18:58.471685 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.471538 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "0160c8de-74a1-4adf-854c-2efcb3f7ab8e" (UID: "0160c8de-74a1-4adf-854c-2efcb3f7ab8e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:18:58.471936 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.471908 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-kube-api-access-cld6j" (OuterVolumeSpecName: "kube-api-access-cld6j") pod "0160c8de-74a1-4adf-854c-2efcb3f7ab8e" (UID: "0160c8de-74a1-4adf-854c-2efcb3f7ab8e"). InnerVolumeSpecName "kube-api-access-cld6j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:18:58.477664 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.477642 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0160c8de-74a1-4adf-854c-2efcb3f7ab8e" (UID: "0160c8de-74a1-4adf-854c-2efcb3f7ab8e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:18:58.570095 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.570066 2566 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-ca-trust-extracted\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\""
Apr 22 14:18:58.570095 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.570091 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-trusted-ca\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\""
Apr 22 14:18:58.570095 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.570101 2566 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-bound-sa-token\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\""
Apr 22 14:18:58.570358 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.570109 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cld6j\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-kube-api-access-cld6j\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\""
Apr 22 14:18:58.570358 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.570118 2566 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-certificates\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\""
Apr 22 14:18:58.570358 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.570127 2566 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-installation-pull-secrets\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\""
Apr 22 14:18:58.570358 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.570136 2566 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-registry-tls\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\""
Apr 22 14:18:58.570358 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:58.570144 2566 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0160c8de-74a1-4adf-854c-2efcb3f7ab8e-image-registry-private-configuration\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\""
Apr 22 14:18:59.362429 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:59.362394 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-779f98969c-lk4m8" event={"ID":"0160c8de-74a1-4adf-854c-2efcb3f7ab8e","Type":"ContainerDied","Data":"378745f5fd1f758b98b6cba782dcda9101441f3c554501cd87344987fcd13112"}
Apr 22 14:18:59.362883 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:59.362453 2566 scope.go:117] "RemoveContainer" containerID="5befc7fb99adfdd552d0e9d22a3f97492a60cb5c7885dc15ac5ea7c80a7ea998"
Apr 22 14:18:59.362883 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:59.362408 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-779f98969c-lk4m8"
Apr 22 14:18:59.380512 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:59.380480 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-779f98969c-lk4m8"]
Apr 22 14:18:59.382619 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:18:59.382598 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-779f98969c-lk4m8"]
Apr 22 14:19:00.632353 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:19:00.632323 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0160c8de-74a1-4adf-854c-2efcb3f7ab8e" path="/var/lib/kubelet/pods/0160c8de-74a1-4adf-854c-2efcb3f7ab8e/volumes"
Apr 22 14:19:01.026911 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:19:01.026834 2566 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" podUID="be5941d8-31ae-4b55-ab1f-4fe7c679b8c8" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 22 14:19:11.027167 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:19:11.027128 2566 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" podUID="be5941d8-31ae-4b55-ab1f-4fe7c679b8c8" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 22 14:19:11.027627 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:19:11.027206 2566 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb"
Apr 22 14:19:11.027791 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:19:11.027758 2566 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"959f1ccdf002ac0c0fa0fb2593b0400de522217d07e4ba8f4f8ab003eb3adce4"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 22 14:19:11.027838 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:19:11.027817 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" podUID="be5941d8-31ae-4b55-ab1f-4fe7c679b8c8" containerName="service-proxy" containerID="cri-o://959f1ccdf002ac0c0fa0fb2593b0400de522217d07e4ba8f4f8ab003eb3adce4" gracePeriod=30
Apr 22 14:19:11.399050 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:19:11.397946 2566 generic.go:358] "Generic (PLEG): container finished" podID="be5941d8-31ae-4b55-ab1f-4fe7c679b8c8" containerID="959f1ccdf002ac0c0fa0fb2593b0400de522217d07e4ba8f4f8ab003eb3adce4" exitCode=2
Apr 22 14:19:11.399050 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:19:11.398000 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" event={"ID":"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8","Type":"ContainerDied","Data":"959f1ccdf002ac0c0fa0fb2593b0400de522217d07e4ba8f4f8ab003eb3adce4"}
Apr 22 14:19:11.399050 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:19:11.398031 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5874c5788b-4jfgb" event={"ID":"be5941d8-31ae-4b55-ab1f-4fe7c679b8c8","Type":"ContainerStarted","Data":"e16313cfd95cea93380669d78b1e28f10426d8d55c4ec378e01c5b147a4ef3ab"}
Apr 22 14:19:30.402714 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:19:30.402665 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs\") pod \"network-metrics-daemon-9rgrl\" (UID: \"ff0fda3b-a631-4479-bca1-451b3fd7ac2f\") " pod="openshift-multus/network-metrics-daemon-9rgrl"
Apr 22 14:19:30.404955 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:19:30.404932 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff0fda3b-a631-4479-bca1-451b3fd7ac2f-metrics-certs\") pod \"network-metrics-daemon-9rgrl\" (UID: \"ff0fda3b-a631-4479-bca1-451b3fd7ac2f\") " pod="openshift-multus/network-metrics-daemon-9rgrl"
Apr 22 14:19:30.633395 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:19:30.633361 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-r76nx\""
Apr 22 14:19:30.640806 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:19:30.640785 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9rgrl"
Apr 22 14:19:30.755883 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:19:30.755861 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9rgrl"]
Apr 22 14:19:30.758343 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:19:30.758315 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff0fda3b_a631_4479_bca1_451b3fd7ac2f.slice/crio-232e00e99575cd1c31abd7d42578b1895a28aaf3e15d83dd0f92589dcfdea7b6 WatchSource:0}: Error finding container 232e00e99575cd1c31abd7d42578b1895a28aaf3e15d83dd0f92589dcfdea7b6: Status 404 returned error can't find the container with id 232e00e99575cd1c31abd7d42578b1895a28aaf3e15d83dd0f92589dcfdea7b6
Apr 22 14:19:31.452256 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:19:31.452211 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9rgrl" event={"ID":"ff0fda3b-a631-4479-bca1-451b3fd7ac2f","Type":"ContainerStarted","Data":"232e00e99575cd1c31abd7d42578b1895a28aaf3e15d83dd0f92589dcfdea7b6"}
Apr 22 14:19:32.456325 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:19:32.456281 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9rgrl" event={"ID":"ff0fda3b-a631-4479-bca1-451b3fd7ac2f","Type":"ContainerStarted","Data":"545686ba1d1ef7cc086c51f32c638097d816cf12216ab214aba2e2e245f2e073"}
Apr 22 14:19:32.456700 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:19:32.456331 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9rgrl" event={"ID":"ff0fda3b-a631-4479-bca1-451b3fd7ac2f","Type":"ContainerStarted","Data":"5ea7b6298b68abf584d36c64548ffb415170e20735430be8f8727c6916561727"}
Apr 22 14:19:32.473476 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:19:32.473409 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9rgrl" podStartSLOduration=253.474199652 podStartE2EDuration="4m14.473392214s" podCreationTimestamp="2026-04-22 14:15:18 +0000 UTC" firstStartedPulling="2026-04-22 14:19:30.760722864 +0000 UTC m=+252.740234233" lastFinishedPulling="2026-04-22 14:19:31.759915434 +0000 UTC m=+253.739426795" observedRunningTime="2026-04-22 14:19:32.473370004 +0000 UTC m=+254.452881385" watchObservedRunningTime="2026-04-22 14:19:32.473392214 +0000 UTC m=+254.452903593"
Apr 22 14:20:18.495944 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:20:18.495914 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/ovn-acl-logging/0.log"
Apr 22 14:20:18.498649 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:20:18.498628 2566 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 14:20:18.503514 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:20:18.503494 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/ovn-acl-logging/0.log"
Apr 22 14:23:11.918394 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:11.918359 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-c8tlw"]
Apr 22 14:23:11.918810 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:11.918619 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0160c8de-74a1-4adf-854c-2efcb3f7ab8e" containerName="registry"
Apr 22 14:23:11.918810 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:11.918632 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="0160c8de-74a1-4adf-854c-2efcb3f7ab8e" containerName="registry"
Apr 22 14:23:11.918810 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:11.918681 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="0160c8de-74a1-4adf-854c-2efcb3f7ab8e" containerName="registry"
Apr 22 14:23:11.920243 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:11.920227 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-c8tlw"
Apr 22 14:23:11.923117 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:11.923094 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 22 14:23:11.923253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:11.923100 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 22 14:23:11.924267 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:11.924254 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-ffw5m\""
Apr 22 14:23:11.928892 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:11.928875 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 22 14:23:11.944119 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:11.944097 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-c8tlw"]
Apr 22 14:23:12.026787 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:12.026754 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6dqv\" (UniqueName: \"kubernetes.io/projected/1ee7ffd2-470a-44c8-81a5-65ad3bea8e63-kube-api-access-w6dqv\") pod \"seaweedfs-86cc847c5c-c8tlw\" (UID: \"1ee7ffd2-470a-44c8-81a5-65ad3bea8e63\") " pod="kserve/seaweedfs-86cc847c5c-c8tlw"
Apr 22 14:23:12.026961 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:12.026814 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1ee7ffd2-470a-44c8-81a5-65ad3bea8e63-data\") pod \"seaweedfs-86cc847c5c-c8tlw\" (UID: \"1ee7ffd2-470a-44c8-81a5-65ad3bea8e63\") " pod="kserve/seaweedfs-86cc847c5c-c8tlw"
Apr 22 14:23:12.127182 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:12.127145 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6dqv\" (UniqueName: \"kubernetes.io/projected/1ee7ffd2-470a-44c8-81a5-65ad3bea8e63-kube-api-access-w6dqv\") pod \"seaweedfs-86cc847c5c-c8tlw\" (UID: \"1ee7ffd2-470a-44c8-81a5-65ad3bea8e63\") " pod="kserve/seaweedfs-86cc847c5c-c8tlw"
Apr 22 14:23:12.127347 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:12.127209 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1ee7ffd2-470a-44c8-81a5-65ad3bea8e63-data\") pod \"seaweedfs-86cc847c5c-c8tlw\" (UID: \"1ee7ffd2-470a-44c8-81a5-65ad3bea8e63\") " pod="kserve/seaweedfs-86cc847c5c-c8tlw"
Apr 22 14:23:12.127586 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:12.127565 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1ee7ffd2-470a-44c8-81a5-65ad3bea8e63-data\") pod \"seaweedfs-86cc847c5c-c8tlw\" (UID: \"1ee7ffd2-470a-44c8-81a5-65ad3bea8e63\") " pod="kserve/seaweedfs-86cc847c5c-c8tlw"
Apr 22 14:23:12.136725 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:12.136697 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6dqv\" (UniqueName: \"kubernetes.io/projected/1ee7ffd2-470a-44c8-81a5-65ad3bea8e63-kube-api-access-w6dqv\") pod \"seaweedfs-86cc847c5c-c8tlw\" (UID: \"1ee7ffd2-470a-44c8-81a5-65ad3bea8e63\") " pod="kserve/seaweedfs-86cc847c5c-c8tlw"
Apr 22 14:23:12.228583 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:12.228495 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-c8tlw"
Apr 22 14:23:12.343599 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:12.343557 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-c8tlw"]
Apr 22 14:23:12.346123 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:23:12.346096 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ee7ffd2_470a_44c8_81a5_65ad3bea8e63.slice/crio-b0a8f331d4cd24f7a02d5cbeec4a0ba4492cad18967f899b78b33559a929fad7 WatchSource:0}: Error finding container b0a8f331d4cd24f7a02d5cbeec4a0ba4492cad18967f899b78b33559a929fad7: Status 404 returned error can't find the container with id b0a8f331d4cd24f7a02d5cbeec4a0ba4492cad18967f899b78b33559a929fad7
Apr 22 14:23:12.347360 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:12.347345 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 14:23:13.001809 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:13.001771 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-c8tlw" event={"ID":"1ee7ffd2-470a-44c8-81a5-65ad3bea8e63","Type":"ContainerStarted","Data":"b0a8f331d4cd24f7a02d5cbeec4a0ba4492cad18967f899b78b33559a929fad7"}
Apr 22 14:23:16.011927 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:16.011896 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-c8tlw" event={"ID":"1ee7ffd2-470a-44c8-81a5-65ad3bea8e63","Type":"ContainerStarted","Data":"7b8e46336a0c5dc3c24a57bbd231ed61131b0f2639c296fb52695001f135727f"}
Apr 22 14:23:16.012331 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:16.012045 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-c8tlw"
Apr 22 14:23:16.031364 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:16.031320 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-c8tlw" podStartSLOduration=2.362815512 podStartE2EDuration="5.031304392s" podCreationTimestamp="2026-04-22 14:23:11 +0000 UTC" firstStartedPulling="2026-04-22 14:23:12.347485268 +0000 UTC m=+474.326996625" lastFinishedPulling="2026-04-22 14:23:15.015974146 +0000 UTC m=+476.995485505" observedRunningTime="2026-04-22 14:23:16.02984474 +0000 UTC m=+478.009356117" watchObservedRunningTime="2026-04-22 14:23:16.031304392 +0000 UTC m=+478.010815769"
Apr 22 14:23:22.017519 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:23:22.017483 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-c8tlw"
Apr 22 14:24:20.998666 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:20.998630 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-ggqbz"]
Apr 22 14:24:21.001777 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.001755 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-ggqbz"
Apr 22 14:24:21.004487 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.004469 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 22 14:24:21.004615 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.004490 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-rhqwj\""
Apr 22 14:24:21.012027 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.012007 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-ggqbz"]
Apr 22 14:24:21.013000 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.012981 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-k5vb4"]
Apr 22 14:24:21.015750 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.015734 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-k5vb4"
Apr 22 14:24:21.018130 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.018106 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 22 14:24:21.018245 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.018177 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-rcnkw\""
Apr 22 14:24:21.025664 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.025640 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-k5vb4"]
Apr 22 14:24:21.089872 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.089839 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggxhz\" (UniqueName: \"kubernetes.io/projected/fd025db6-f28c-4a4c-8279-22e1a33a50ca-kube-api-access-ggxhz\") pod \"model-serving-api-86f7b4b499-ggqbz\" (UID: \"fd025db6-f28c-4a4c-8279-22e1a33a50ca\") " pod="kserve/model-serving-api-86f7b4b499-ggqbz"
Apr 22 14:24:21.090050 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.089888 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fd025db6-f28c-4a4c-8279-22e1a33a50ca-tls-certs\") pod \"model-serving-api-86f7b4b499-ggqbz\" (UID: \"fd025db6-f28c-4a4c-8279-22e1a33a50ca\") " pod="kserve/model-serving-api-86f7b4b499-ggqbz"
Apr 22 14:24:21.190948 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.190921 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggxhz\" (UniqueName: \"kubernetes.io/projected/fd025db6-f28c-4a4c-8279-22e1a33a50ca-kube-api-access-ggxhz\") pod \"model-serving-api-86f7b4b499-ggqbz\" (UID: \"fd025db6-f28c-4a4c-8279-22e1a33a50ca\") " pod="kserve/model-serving-api-86f7b4b499-ggqbz"
Apr 22 14:24:21.191133 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.190956 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x5lh\" (UniqueName: \"kubernetes.io/projected/01933426-0fec-43f9-bb14-c5367060ee88-kube-api-access-8x5lh\") pod \"odh-model-controller-696fc77849-k5vb4\" (UID: \"01933426-0fec-43f9-bb14-c5367060ee88\") " pod="kserve/odh-model-controller-696fc77849-k5vb4"
Apr 22 14:24:21.191133 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.190992 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fd025db6-f28c-4a4c-8279-22e1a33a50ca-tls-certs\") pod \"model-serving-api-86f7b4b499-ggqbz\" (UID: \"fd025db6-f28c-4a4c-8279-22e1a33a50ca\") " pod="kserve/model-serving-api-86f7b4b499-ggqbz"
Apr 22 14:24:21.191133 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.191038 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01933426-0fec-43f9-bb14-c5367060ee88-cert\") pod \"odh-model-controller-696fc77849-k5vb4\" (UID: \"01933426-0fec-43f9-bb14-c5367060ee88\") " pod="kserve/odh-model-controller-696fc77849-k5vb4"
Apr 22 14:24:21.193285 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.193263 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fd025db6-f28c-4a4c-8279-22e1a33a50ca-tls-certs\") pod \"model-serving-api-86f7b4b499-ggqbz\" (UID: \"fd025db6-f28c-4a4c-8279-22e1a33a50ca\") " pod="kserve/model-serving-api-86f7b4b499-ggqbz"
Apr 22 14:24:21.198880 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.198860 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggxhz\" (UniqueName: \"kubernetes.io/projected/fd025db6-f28c-4a4c-8279-22e1a33a50ca-kube-api-access-ggxhz\") pod \"model-serving-api-86f7b4b499-ggqbz\" (UID: \"fd025db6-f28c-4a4c-8279-22e1a33a50ca\") " pod="kserve/model-serving-api-86f7b4b499-ggqbz"
Apr 22 14:24:21.292351 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.292266 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01933426-0fec-43f9-bb14-c5367060ee88-cert\") pod \"odh-model-controller-696fc77849-k5vb4\" (UID: \"01933426-0fec-43f9-bb14-c5367060ee88\") " pod="kserve/odh-model-controller-696fc77849-k5vb4"
Apr 22 14:24:21.292351 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.292313 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8x5lh\" (UniqueName: \"kubernetes.io/projected/01933426-0fec-43f9-bb14-c5367060ee88-kube-api-access-8x5lh\") pod \"odh-model-controller-696fc77849-k5vb4\" (UID: \"01933426-0fec-43f9-bb14-c5367060ee88\") " pod="kserve/odh-model-controller-696fc77849-k5vb4"
Apr 22 14:24:21.294653 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.294632 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01933426-0fec-43f9-bb14-c5367060ee88-cert\") pod \"odh-model-controller-696fc77849-k5vb4\" (UID: \"01933426-0fec-43f9-bb14-c5367060ee88\") " pod="kserve/odh-model-controller-696fc77849-k5vb4"
Apr 22 14:24:21.300487 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.300466 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x5lh\" (UniqueName: \"kubernetes.io/projected/01933426-0fec-43f9-bb14-c5367060ee88-kube-api-access-8x5lh\") pod \"odh-model-controller-696fc77849-k5vb4\" (UID: \"01933426-0fec-43f9-bb14-c5367060ee88\") " pod="kserve/odh-model-controller-696fc77849-k5vb4"
Apr 22 14:24:21.311618 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.311589 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-ggqbz"
Apr 22 14:24:21.324946 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.324516 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-k5vb4"
Apr 22 14:24:21.437187 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.437156 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-ggqbz"]
Apr 22 14:24:21.440710 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:24:21.440668 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd025db6_f28c_4a4c_8279_22e1a33a50ca.slice/crio-b44c481f2d8ce1dd32c79f73bc6f28c5ea381a2f0e42ca21456a6741bc0b304f WatchSource:0}: Error finding container b44c481f2d8ce1dd32c79f73bc6f28c5ea381a2f0e42ca21456a6741bc0b304f: Status 404 returned error can't find the container with id b44c481f2d8ce1dd32c79f73bc6f28c5ea381a2f0e42ca21456a6741bc0b304f
Apr 22 14:24:21.455354 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:21.455334 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-k5vb4"]
Apr 22 14:24:21.457514 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:24:21.457490 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01933426_0fec_43f9_bb14_c5367060ee88.slice/crio-a23d04a76cfe10e960157000dce4465f31a1baf6eb5176e7e76f1f3731e411e0 WatchSource:0}: Error finding container a23d04a76cfe10e960157000dce4465f31a1baf6eb5176e7e76f1f3731e411e0: Status 404 returned error can't find the container with id a23d04a76cfe10e960157000dce4465f31a1baf6eb5176e7e76f1f3731e411e0
Apr 22 14:24:22.182670 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:22.179750 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-ggqbz" event={"ID":"fd025db6-f28c-4a4c-8279-22e1a33a50ca","Type":"ContainerStarted","Data":"b44c481f2d8ce1dd32c79f73bc6f28c5ea381a2f0e42ca21456a6741bc0b304f"}
Apr 22 14:24:22.183133 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:22.182810 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-k5vb4" event={"ID":"01933426-0fec-43f9-bb14-c5367060ee88","Type":"ContainerStarted","Data":"a23d04a76cfe10e960157000dce4465f31a1baf6eb5176e7e76f1f3731e411e0"}
Apr 22 14:24:25.193610 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:25.193573 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-ggqbz" event={"ID":"fd025db6-f28c-4a4c-8279-22e1a33a50ca","Type":"ContainerStarted","Data":"c0cab3792433a7c74ac7b4ab211392e0cf433c89129f5d8655955bc8a83ce139"}
Apr 22 14:24:25.194033 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:25.193688 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-ggqbz"
Apr 22 14:24:25.194846 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:25.194823 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-k5vb4" event={"ID":"01933426-0fec-43f9-bb14-c5367060ee88","Type":"ContainerStarted","Data":"5b7f4f7241058e3da2beb5c788d90880e09708fc9d85b1ba9dd4a004ea16671f"}
Apr 22 14:24:25.194961 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:25.194919 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-k5vb4"
Apr 22 14:24:25.212260 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:25.212215 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-ggqbz" podStartSLOduration=1.598621442 podStartE2EDuration="5.212203952s" podCreationTimestamp="2026-04-22 14:24:20 +0000 UTC" firstStartedPulling="2026-04-22 14:24:21.44245782 +0000 UTC m=+543.421969176" lastFinishedPulling="2026-04-22 14:24:25.056040327 +0000 UTC m=+547.035551686" observedRunningTime="2026-04-22 14:24:25.211898408 +0000 UTC m=+547.191409786" watchObservedRunningTime="2026-04-22 14:24:25.212203952 +0000 UTC m=+547.191715363"
Apr 22 14:24:25.227821 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:25.227754 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-k5vb4" podStartSLOduration=1.626395297 podStartE2EDuration="5.227739663s" podCreationTimestamp="2026-04-22 14:24:20 +0000 UTC" firstStartedPulling="2026-04-22 14:24:21.458683436 +0000 UTC m=+543.438194792" lastFinishedPulling="2026-04-22 14:24:25.060027786 +0000 UTC m=+547.039539158" observedRunningTime="2026-04-22 14:24:25.227707232 +0000 UTC m=+547.207218610" watchObservedRunningTime="2026-04-22 14:24:25.227739663 +0000 UTC m=+547.207251023"
Apr 22 14:24:36.200049 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:36.200021 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-k5vb4"
Apr 22 14:24:36.201957 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:36.201934 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-ggqbz"
Apr 22 14:24:55.858312 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:55.858281 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh"]
Apr 22 14:24:55.864773 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:55.864744 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh"
Apr 22 14:24:55.867896 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:55.867874 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-x29k5\""
Apr 22 14:24:55.874334 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:55.874313 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh"]
Apr 22 14:24:55.936864 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:55.936825 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65512743-a88c-4def-8691-ddc0f0febb55-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh\" (UID: \"65512743-a88c-4def-8691-ddc0f0febb55\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh"
Apr 22 14:24:56.038023 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:56.037982 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65512743-a88c-4def-8691-ddc0f0febb55-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh\" (UID: \"65512743-a88c-4def-8691-ddc0f0febb55\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh"
Apr 22 14:24:56.038327 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:56.038309 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65512743-a88c-4def-8691-ddc0f0febb55-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh\" (UID: \"65512743-a88c-4def-8691-ddc0f0febb55\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh"
Apr 22 14:24:56.174575 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:56.174489 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh"
Apr 22 14:24:56.300217 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:56.300190 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh"]
Apr 22 14:24:56.302646 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:24:56.302615 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65512743_a88c_4def_8691_ddc0f0febb55.slice/crio-c8bff07f5561e34b2066c687fbb30846975dbf07d40e1a4fdd5e0b9507dadd50 WatchSource:0}: Error finding container c8bff07f5561e34b2066c687fbb30846975dbf07d40e1a4fdd5e0b9507dadd50: Status 404 returned error can't find the container with id c8bff07f5561e34b2066c687fbb30846975dbf07d40e1a4fdd5e0b9507dadd50
Apr 22 14:24:57.280708 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:24:57.280666 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" event={"ID":"65512743-a88c-4def-8691-ddc0f0febb55","Type":"ContainerStarted","Data":"c8bff07f5561e34b2066c687fbb30846975dbf07d40e1a4fdd5e0b9507dadd50"}
Apr 22 14:25:00.290117 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:25:00.290081 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" event={"ID":"65512743-a88c-4def-8691-ddc0f0febb55","Type":"ContainerStarted","Data":"ea71a1f2004745c5c4a30d6cc4f45e2e85605ef843fff0e467d2d26608450458"}
Apr 22 14:25:04.302332 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:25:04.302298 2566 generic.go:358] "Generic (PLEG): container finished" podID="65512743-a88c-4def-8691-ddc0f0febb55" containerID="ea71a1f2004745c5c4a30d6cc4f45e2e85605ef843fff0e467d2d26608450458" exitCode=0
Apr 22 14:25:04.302708 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:25:04.302362 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" event={"ID":"65512743-a88c-4def-8691-ddc0f0febb55","Type":"ContainerDied","Data":"ea71a1f2004745c5c4a30d6cc4f45e2e85605ef843fff0e467d2d26608450458"}
Apr 22 14:25:17.347094 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:25:17.347057 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" event={"ID":"65512743-a88c-4def-8691-ddc0f0febb55","Type":"ContainerStarted","Data":"82df6f27dd7733e40d635eba82d2f7a0ad3767f9949e2d55d75f973d91eef8f9"}
Apr 22 14:25:18.527161 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:25:18.527125 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/ovn-acl-logging/0.log"
Apr 22 14:25:18.529151 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:25:18.529128 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/ovn-acl-logging/0.log"
Apr 22 14:25:21.361291 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:25:21.361248 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" event={"ID":"65512743-a88c-4def-8691-ddc0f0febb55","Type":"ContainerStarted","Data":"a51932ca7bbb58bf4c644a23c8b813c019a3c6a1f62df70fc32717092a477936"}
Apr 22 14:25:21.361736 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:25:21.361477 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh"
Apr 22 14:25:21.361736 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:25:21.361499 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh"
Apr 22 14:25:21.362763 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:25:21.362736 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 22 14:25:21.363353 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:25:21.363330 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 14:25:21.394422 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:25:21.394374 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podStartSLOduration=2.202277041 podStartE2EDuration="26.394360324s" podCreationTimestamp="2026-04-22 14:24:55 +0000 UTC" firstStartedPulling="2026-04-22 14:24:56.304381853 +0000 UTC m=+578.283893224" lastFinishedPulling="2026-04-22 14:25:20.496465141 +0000 UTC m=+602.475976507" observedRunningTime="2026-04-22 14:25:21.394070475 +0000 UTC m=+603.373581853" watchObservedRunningTime="2026-04-22 14:25:21.394360324 +0000 UTC m=+603.373871767"
Apr 22 14:25:22.363972 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:25:22.363935 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 22 14:25:22.364412 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:25:22.364297 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 14:25:32.363899 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:25:32.363860 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 22 14:25:32.364380 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:25:32.364212 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 14:25:42.364028 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:25:42.363926 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 22 14:25:42.364392 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:25:42.364317 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 14:25:52.364579 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:25:52.364527 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 22 14:25:52.365057 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:25:52.365009 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 14:26:02.364937 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:02.364871 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 22 14:26:02.367520 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:02.365324 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 14:26:12.364477 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:12.364403 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused"
Apr 22 14:26:12.365045 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:12.364867 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 14:26:22.364686 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:22.364649 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh"
Apr 22 14:26:22.365059 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:22.364882 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh"
Apr 22 14:26:31.052276 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:31.052247 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh"]
Apr 22
14:26:31.052788 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:31.052661 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="kserve-container" containerID="cri-o://82df6f27dd7733e40d635eba82d2f7a0ad3767f9949e2d55d75f973d91eef8f9" gracePeriod=30 Apr 22 14:26:31.052788 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:31.052677 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="agent" containerID="cri-o://a51932ca7bbb58bf4c644a23c8b813c019a3c6a1f62df70fc32717092a477936" gracePeriod=30 Apr 22 14:26:31.139709 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:31.139675 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx"] Apr 22 14:26:31.141990 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:31.141966 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" Apr 22 14:26:31.156240 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:31.156213 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx"] Apr 22 14:26:31.227272 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:31.227241 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b20ca612-2287-4a0f-a3fc-210e26f59ada-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx\" (UID: \"b20ca612-2287-4a0f-a3fc-210e26f59ada\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" Apr 22 14:26:31.240492 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:31.240460 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4"] Apr 22 14:26:31.242585 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:31.242570 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" Apr 22 14:26:31.258645 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:31.258614 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4"] Apr 22 14:26:31.327822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:31.327739 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b20ca612-2287-4a0f-a3fc-210e26f59ada-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx\" (UID: \"b20ca612-2287-4a0f-a3fc-210e26f59ada\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" Apr 22 14:26:31.327822 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:31.327803 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4732b54-3b30-4a6a-9533-205f36c1d827-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4\" (UID: \"b4732b54-3b30-4a6a-9533-205f36c1d827\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" Apr 22 14:26:31.328141 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:31.328119 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b20ca612-2287-4a0f-a3fc-210e26f59ada-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx\" (UID: \"b20ca612-2287-4a0f-a3fc-210e26f59ada\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" Apr 22 14:26:31.428715 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:31.428682 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4732b54-3b30-4a6a-9533-205f36c1d827-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4\" (UID: \"b4732b54-3b30-4a6a-9533-205f36c1d827\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" Apr 22 14:26:31.429054 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:31.429035 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4732b54-3b30-4a6a-9533-205f36c1d827-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4\" (UID: \"b4732b54-3b30-4a6a-9533-205f36c1d827\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" Apr 22 14:26:31.457240 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:31.457219 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" Apr 22 14:26:31.552899 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:31.552868 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" Apr 22 14:26:31.582003 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:31.581975 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx"] Apr 22 14:26:31.586977 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:26:31.586941 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb20ca612_2287_4a0f_a3fc_210e26f59ada.slice/crio-c4d4bbe15481ed77004170d829457c89d0758b6bb7f57b13bb605a9f4bff6d22 WatchSource:0}: Error finding container c4d4bbe15481ed77004170d829457c89d0758b6bb7f57b13bb605a9f4bff6d22: Status 404 returned error can't find the container with id c4d4bbe15481ed77004170d829457c89d0758b6bb7f57b13bb605a9f4bff6d22 Apr 22 14:26:31.675678 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:31.675650 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4"] Apr 22 14:26:31.678368 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:26:31.678333 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4732b54_3b30_4a6a_9533_205f36c1d827.slice/crio-b030345bffca9584b0cfcc38b32f6da3e7849a9de70b0ef6bbf6a6e0b9604e42 WatchSource:0}: Error finding container b030345bffca9584b0cfcc38b32f6da3e7849a9de70b0ef6bbf6a6e0b9604e42: Status 404 returned error can't find the container with id b030345bffca9584b0cfcc38b32f6da3e7849a9de70b0ef6bbf6a6e0b9604e42 Apr 22 14:26:32.364840 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:32.364796 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused" Apr 22 14:26:32.365221 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:32.365158 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:26:32.552313 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:32.552271 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" event={"ID":"b4732b54-3b30-4a6a-9533-205f36c1d827","Type":"ContainerStarted","Data":"2ff734216cbf39e414f17bbc5a348dc7613055d9e4cf3fa9183aa0a2735ee17c"} Apr 22 14:26:32.552313 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:32.552316 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" event={"ID":"b4732b54-3b30-4a6a-9533-205f36c1d827","Type":"ContainerStarted","Data":"b030345bffca9584b0cfcc38b32f6da3e7849a9de70b0ef6bbf6a6e0b9604e42"} Apr 22 14:26:32.553588 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:32.553566 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" event={"ID":"b20ca612-2287-4a0f-a3fc-210e26f59ada","Type":"ContainerStarted","Data":"ae85021e5aef950c0b9c5da6b921d4207ccb3611ad3750c98b1a33b26ca8095e"} Apr 22 14:26:32.553661 ip-10-0-129-161 
kubenswrapper[2566]: I0422 14:26:32.553595 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" event={"ID":"b20ca612-2287-4a0f-a3fc-210e26f59ada","Type":"ContainerStarted","Data":"c4d4bbe15481ed77004170d829457c89d0758b6bb7f57b13bb605a9f4bff6d22"} Apr 22 14:26:35.563685 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:35.563658 2566 generic.go:358] "Generic (PLEG): container finished" podID="65512743-a88c-4def-8691-ddc0f0febb55" containerID="82df6f27dd7733e40d635eba82d2f7a0ad3767f9949e2d55d75f973d91eef8f9" exitCode=0 Apr 22 14:26:35.563977 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:35.563730 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" event={"ID":"65512743-a88c-4def-8691-ddc0f0febb55","Type":"ContainerDied","Data":"82df6f27dd7733e40d635eba82d2f7a0ad3767f9949e2d55d75f973d91eef8f9"} Apr 22 14:26:35.564800 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:35.564781 2566 generic.go:358] "Generic (PLEG): container finished" podID="b4732b54-3b30-4a6a-9533-205f36c1d827" containerID="2ff734216cbf39e414f17bbc5a348dc7613055d9e4cf3fa9183aa0a2735ee17c" exitCode=0 Apr 22 14:26:35.564882 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:35.564834 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" event={"ID":"b4732b54-3b30-4a6a-9533-205f36c1d827","Type":"ContainerDied","Data":"2ff734216cbf39e414f17bbc5a348dc7613055d9e4cf3fa9183aa0a2735ee17c"} Apr 22 14:26:35.566165 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:35.566143 2566 generic.go:358] "Generic (PLEG): container finished" podID="b20ca612-2287-4a0f-a3fc-210e26f59ada" containerID="ae85021e5aef950c0b9c5da6b921d4207ccb3611ad3750c98b1a33b26ca8095e" exitCode=0 Apr 22 14:26:35.566246 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:35.566215 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" event={"ID":"b20ca612-2287-4a0f-a3fc-210e26f59ada","Type":"ContainerDied","Data":"ae85021e5aef950c0b9c5da6b921d4207ccb3611ad3750c98b1a33b26ca8095e"} Apr 22 14:26:36.571852 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:36.571815 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" event={"ID":"b20ca612-2287-4a0f-a3fc-210e26f59ada","Type":"ContainerStarted","Data":"e902e8e51d0560c4c59121e2a7019cfbaebcddde19b9e72436023876c7471e7b"} Apr 22 14:26:36.572332 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:36.572309 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" Apr 22 14:26:36.573504 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:36.573469 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" podUID="b20ca612-2287-4a0f-a3fc-210e26f59ada" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 22 14:26:36.590929 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:36.590876 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" podStartSLOduration=5.590858784 podStartE2EDuration="5.590858784s" 
podCreationTimestamp="2026-04-22 14:26:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:26:36.590131608 +0000 UTC m=+678.569642989" watchObservedRunningTime="2026-04-22 14:26:36.590858784 +0000 UTC m=+678.570370165" Apr 22 14:26:37.575508 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:37.575472 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" podUID="b20ca612-2287-4a0f-a3fc-210e26f59ada" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 22 14:26:42.364800 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:42.364727 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused" Apr 22 14:26:42.365243 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:42.365147 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:26:47.576135 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:47.576084 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" podUID="b20ca612-2287-4a0f-a3fc-210e26f59ada" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 22 14:26:52.364763 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:52.364701 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused" Apr 22 14:26:52.365260 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:52.364857 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" Apr 22 14:26:52.365260 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:52.365161 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:26:52.365366 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:52.365280 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" Apr 22 14:26:57.575510 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:57.575401 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" podUID="b20ca612-2287-4a0f-a3fc-210e26f59ada" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 22 14:26:57.634807 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:57.634775 2566 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" event={"ID":"b4732b54-3b30-4a6a-9533-205f36c1d827","Type":"ContainerStarted","Data":"0e8517bfebd11806b4fb472ecca9646b001b0c5a1cbbe1e642da93749dcb5e16"} Apr 22 14:26:57.635075 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:57.635055 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" Apr 22 14:26:57.636424 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:57.636396 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" podUID="b4732b54-3b30-4a6a-9533-205f36c1d827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 14:26:57.653292 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:57.653175 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" podStartSLOduration=4.964133391 podStartE2EDuration="26.653157682s" podCreationTimestamp="2026-04-22 14:26:31 +0000 UTC" firstStartedPulling="2026-04-22 14:26:35.566026748 +0000 UTC m=+677.545538105" lastFinishedPulling="2026-04-22 14:26:57.255051028 +0000 UTC m=+699.234562396" observedRunningTime="2026-04-22 14:26:57.652917362 +0000 UTC m=+699.632428744" watchObservedRunningTime="2026-04-22 14:26:57.653157682 +0000 UTC m=+699.632669062" Apr 22 14:26:58.637784 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:26:58.637736 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" podUID="b4732b54-3b30-4a6a-9533-205f36c1d827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 14:27:01.190472 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:01.190447 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" Apr 22 14:27:01.254043 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:01.254009 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65512743-a88c-4def-8691-ddc0f0febb55-kserve-provision-location\") pod \"65512743-a88c-4def-8691-ddc0f0febb55\" (UID: \"65512743-a88c-4def-8691-ddc0f0febb55\") " Apr 22 14:27:01.254321 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:01.254299 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65512743-a88c-4def-8691-ddc0f0febb55-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "65512743-a88c-4def-8691-ddc0f0febb55" (UID: "65512743-a88c-4def-8691-ddc0f0febb55"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:27:01.355185 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:01.355157 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65512743-a88c-4def-8691-ddc0f0febb55-kserve-provision-location\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\"" Apr 22 14:27:01.647976 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:01.647886 2566 generic.go:358] "Generic (PLEG): container finished" podID="65512743-a88c-4def-8691-ddc0f0febb55" containerID="a51932ca7bbb58bf4c644a23c8b813c019a3c6a1f62df70fc32717092a477936" exitCode=0 Apr 22 14:27:01.647976 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:01.647948 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" event={"ID":"65512743-a88c-4def-8691-ddc0f0febb55","Type":"ContainerDied","Data":"a51932ca7bbb58bf4c644a23c8b813c019a3c6a1f62df70fc32717092a477936"} Apr 22 14:27:01.647976 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:01.647974 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" Apr 22 14:27:01.648225 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:01.647986 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh" event={"ID":"65512743-a88c-4def-8691-ddc0f0febb55","Type":"ContainerDied","Data":"c8bff07f5561e34b2066c687fbb30846975dbf07d40e1a4fdd5e0b9507dadd50"} Apr 22 14:27:01.648225 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:01.648002 2566 scope.go:117] "RemoveContainer" containerID="a51932ca7bbb58bf4c644a23c8b813c019a3c6a1f62df70fc32717092a477936" Apr 22 14:27:01.656285 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:01.656264 2566 scope.go:117] "RemoveContainer" containerID="82df6f27dd7733e40d635eba82d2f7a0ad3767f9949e2d55d75f973d91eef8f9" Apr 22 14:27:01.663256 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:01.663239 2566 scope.go:117] "RemoveContainer" containerID="ea71a1f2004745c5c4a30d6cc4f45e2e85605ef843fff0e467d2d26608450458" Apr 22 14:27:01.670224 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:01.670198 2566 scope.go:117] "RemoveContainer" containerID="a51932ca7bbb58bf4c644a23c8b813c019a3c6a1f62df70fc32717092a477936" Apr 22 14:27:01.670539 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:27:01.670511 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a51932ca7bbb58bf4c644a23c8b813c019a3c6a1f62df70fc32717092a477936\": container with ID starting with a51932ca7bbb58bf4c644a23c8b813c019a3c6a1f62df70fc32717092a477936 not found: ID does not exist" containerID="a51932ca7bbb58bf4c644a23c8b813c019a3c6a1f62df70fc32717092a477936" Apr 22 14:27:01.670614 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:01.670554 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a51932ca7bbb58bf4c644a23c8b813c019a3c6a1f62df70fc32717092a477936"} err="failed to get container status \"a51932ca7bbb58bf4c644a23c8b813c019a3c6a1f62df70fc32717092a477936\": rpc error: code = NotFound desc = could not find container \"a51932ca7bbb58bf4c644a23c8b813c019a3c6a1f62df70fc32717092a477936\": container with ID starting with a51932ca7bbb58bf4c644a23c8b813c019a3c6a1f62df70fc32717092a477936 not found: ID does not exist" Apr 22 
14:27:01.670614 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:01.670581 2566 scope.go:117] "RemoveContainer" containerID="82df6f27dd7733e40d635eba82d2f7a0ad3767f9949e2d55d75f973d91eef8f9" Apr 22 14:27:01.670840 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:27:01.670822 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82df6f27dd7733e40d635eba82d2f7a0ad3767f9949e2d55d75f973d91eef8f9\": container with ID starting with 82df6f27dd7733e40d635eba82d2f7a0ad3767f9949e2d55d75f973d91eef8f9 not found: ID does not exist" containerID="82df6f27dd7733e40d635eba82d2f7a0ad3767f9949e2d55d75f973d91eef8f9" Apr 22 14:27:01.670880 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:01.670847 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82df6f27dd7733e40d635eba82d2f7a0ad3767f9949e2d55d75f973d91eef8f9"} err="failed to get container status \"82df6f27dd7733e40d635eba82d2f7a0ad3767f9949e2d55d75f973d91eef8f9\": rpc error: code = NotFound desc = could not find container \"82df6f27dd7733e40d635eba82d2f7a0ad3767f9949e2d55d75f973d91eef8f9\": container with ID starting with 82df6f27dd7733e40d635eba82d2f7a0ad3767f9949e2d55d75f973d91eef8f9 not found: ID does not exist" Apr 22 14:27:01.670880 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:01.670862 2566 scope.go:117] "RemoveContainer" containerID="ea71a1f2004745c5c4a30d6cc4f45e2e85605ef843fff0e467d2d26608450458" Apr 22 14:27:01.671108 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:27:01.671088 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea71a1f2004745c5c4a30d6cc4f45e2e85605ef843fff0e467d2d26608450458\": container with ID starting with ea71a1f2004745c5c4a30d6cc4f45e2e85605ef843fff0e467d2d26608450458 not found: ID does not exist" containerID="ea71a1f2004745c5c4a30d6cc4f45e2e85605ef843fff0e467d2d26608450458" Apr 22 14:27:01.671157 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:01.671114 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea71a1f2004745c5c4a30d6cc4f45e2e85605ef843fff0e467d2d26608450458"} err="failed to get container status \"ea71a1f2004745c5c4a30d6cc4f45e2e85605ef843fff0e467d2d26608450458\": rpc error: code = NotFound desc = could not find container \"ea71a1f2004745c5c4a30d6cc4f45e2e85605ef843fff0e467d2d26608450458\": container with ID starting with ea71a1f2004745c5c4a30d6cc4f45e2e85605ef843fff0e467d2d26608450458 not found: ID does not exist" Apr 22 14:27:01.671232 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:01.671220 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh"] Apr 22 14:27:01.675042 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:01.675022 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8cf9f-predictor-6cd474d9cf-tqwrh"] Apr 22 14:27:02.633443 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:02.633398 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65512743-a88c-4def-8691-ddc0f0febb55" path="/var/lib/kubelet/pods/65512743-a88c-4def-8691-ddc0f0febb55/volumes" Apr 22 14:27:07.575544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:07.575457 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" 
podUID="b20ca612-2287-4a0f-a3fc-210e26f59ada" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 22 14:27:08.638558 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:08.638514 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" podUID="b4732b54-3b30-4a6a-9533-205f36c1d827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 14:27:17.576506 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:17.576458 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" podUID="b20ca612-2287-4a0f-a3fc-210e26f59ada" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 22 14:27:18.638591 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:18.638553 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" podUID="b4732b54-3b30-4a6a-9533-205f36c1d827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 14:27:27.575870 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:27.575823 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" podUID="b20ca612-2287-4a0f-a3fc-210e26f59ada" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 22 14:27:28.638145 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:28.638099 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" podUID="b4732b54-3b30-4a6a-9533-205f36c1d827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 14:27:37.575620 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:37.575573 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" podUID="b20ca612-2287-4a0f-a3fc-210e26f59ada" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 22 14:27:38.638495 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:38.638456 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" podUID="b4732b54-3b30-4a6a-9533-205f36c1d827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 14:27:45.630630 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:45.630600 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" Apr 22 14:27:48.638313 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:48.638274 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" podUID="b4732b54-3b30-4a6a-9533-205f36c1d827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 14:27:58.639547 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:27:58.639515 
2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" Apr 22 14:28:01.176946 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.176907 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p"] Apr 22 14:28:01.177312 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.177169 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="storage-initializer" Apr 22 14:28:01.177312 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.177180 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="storage-initializer" Apr 22 14:28:01.177312 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.177200 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="kserve-container" Apr 22 14:28:01.177312 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.177206 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="kserve-container" Apr 22 14:28:01.177312 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.177213 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="agent" Apr 22 14:28:01.177312 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.177220 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="agent" Apr 22 14:28:01.177312 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.177271 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="kserve-container" Apr 22 14:28:01.177312 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.177281 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="65512743-a88c-4def-8691-ddc0f0febb55" containerName="agent" Apr 22 14:28:01.180180 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.180164 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" Apr 22 14:28:01.185127 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.185103 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-dc953-kube-rbac-proxy-sar-config\"" Apr 22 14:28:01.185277 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.185107 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-dc953-serving-cert\"" Apr 22 14:28:01.185277 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.185106 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 14:28:01.195185 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.195161 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p"] Apr 22 14:28:01.279586 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.279556 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f2818ac-02c9-42a0-ac83-83dc7d6d80c9-openshift-service-ca-bundle\") pod \"model-chainer-raw-dc953-5bffc7f67f-lqm6p\" (UID: \"2f2818ac-02c9-42a0-ac83-83dc7d6d80c9\") " pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" Apr 22 14:28:01.279754 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.279593 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f2818ac-02c9-42a0-ac83-83dc7d6d80c9-proxy-tls\") pod \"model-chainer-raw-dc953-5bffc7f67f-lqm6p\" (UID: \"2f2818ac-02c9-42a0-ac83-83dc7d6d80c9\") " pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" Apr 22 14:28:01.380823 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.380767 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f2818ac-02c9-42a0-ac83-83dc7d6d80c9-openshift-service-ca-bundle\") pod \"model-chainer-raw-dc953-5bffc7f67f-lqm6p\" (UID: \"2f2818ac-02c9-42a0-ac83-83dc7d6d80c9\") " pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" Apr 22 14:28:01.380823 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.380831 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f2818ac-02c9-42a0-ac83-83dc7d6d80c9-proxy-tls\") pod \"model-chainer-raw-dc953-5bffc7f67f-lqm6p\" (UID: \"2f2818ac-02c9-42a0-ac83-83dc7d6d80c9\") " pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" Apr 22 14:28:01.381091 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:28:01.380957 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-dc953-serving-cert: secret "model-chainer-raw-dc953-serving-cert" not found Apr 22 14:28:01.381091 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:28:01.381027 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f2818ac-02c9-42a0-ac83-83dc7d6d80c9-proxy-tls podName:2f2818ac-02c9-42a0-ac83-83dc7d6d80c9 nodeName:}" failed. No retries permitted until 2026-04-22 14:28:01.881005477 +0000 UTC m=+763.860516834 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2f2818ac-02c9-42a0-ac83-83dc7d6d80c9-proxy-tls") pod "model-chainer-raw-dc953-5bffc7f67f-lqm6p" (UID: "2f2818ac-02c9-42a0-ac83-83dc7d6d80c9") : secret "model-chainer-raw-dc953-serving-cert" not found Apr 22 14:28:01.381543 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.381520 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f2818ac-02c9-42a0-ac83-83dc7d6d80c9-openshift-service-ca-bundle\") pod \"model-chainer-raw-dc953-5bffc7f67f-lqm6p\" (UID: \"2f2818ac-02c9-42a0-ac83-83dc7d6d80c9\") " pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" Apr 22 14:28:01.883970 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.883927 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f2818ac-02c9-42a0-ac83-83dc7d6d80c9-proxy-tls\") pod \"model-chainer-raw-dc953-5bffc7f67f-lqm6p\" (UID: \"2f2818ac-02c9-42a0-ac83-83dc7d6d80c9\") " pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" Apr 22 14:28:01.886293 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:01.886266 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f2818ac-02c9-42a0-ac83-83dc7d6d80c9-proxy-tls\") pod \"model-chainer-raw-dc953-5bffc7f67f-lqm6p\" (UID: \"2f2818ac-02c9-42a0-ac83-83dc7d6d80c9\") " pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" Apr 22 14:28:02.089785 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:02.089739 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" Apr 22 14:28:02.207533 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:02.207502 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p"] Apr 22 14:28:02.210418 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:28:02.210392 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f2818ac_02c9_42a0_ac83_83dc7d6d80c9.slice/crio-7fc3e9150498736815027fb39ba07b17eb807cee9bc5dcf91fdf7cf512c0a7de WatchSource:0}: Error finding container 7fc3e9150498736815027fb39ba07b17eb807cee9bc5dcf91fdf7cf512c0a7de: Status 404 returned error can't find the container with id 7fc3e9150498736815027fb39ba07b17eb807cee9bc5dcf91fdf7cf512c0a7de Apr 22 14:28:02.810639 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:02.810590 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" event={"ID":"2f2818ac-02c9-42a0-ac83-83dc7d6d80c9","Type":"ContainerStarted","Data":"7fc3e9150498736815027fb39ba07b17eb807cee9bc5dcf91fdf7cf512c0a7de"} Apr 22 14:28:04.817409 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:04.817319 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" event={"ID":"2f2818ac-02c9-42a0-ac83-83dc7d6d80c9","Type":"ContainerStarted","Data":"83ecfdf9cfa37a6b2508fbc1c2827dd422ce237eb490313ec50e3616e079c078"} Apr 22 14:28:04.817409 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:04.817379 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" Apr 22 14:28:04.836478 
ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:04.836410 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" podStartSLOduration=1.572349289 podStartE2EDuration="3.836395332s" podCreationTimestamp="2026-04-22 14:28:01 +0000 UTC" firstStartedPulling="2026-04-22 14:28:02.212247944 +0000 UTC m=+764.191759301" lastFinishedPulling="2026-04-22 14:28:04.476293974 +0000 UTC m=+766.455805344" observedRunningTime="2026-04-22 14:28:04.835053522 +0000 UTC m=+766.814564899" watchObservedRunningTime="2026-04-22 14:28:04.836395332 +0000 UTC m=+766.815906709" Apr 22 14:28:10.826097 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:10.826070 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" Apr 22 14:28:11.284949 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.284917 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p"] Apr 22 14:28:11.285165 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.285116 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" podUID="2f2818ac-02c9-42a0-ac83-83dc7d6d80c9" containerName="model-chainer-raw-dc953" containerID="cri-o://83ecfdf9cfa37a6b2508fbc1c2827dd422ce237eb490313ec50e3616e079c078" gracePeriod=30 Apr 22 14:28:11.452162 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.452127 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx"] Apr 22 14:28:11.452418 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.452379 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" podUID="b20ca612-2287-4a0f-a3fc-210e26f59ada" containerName="kserve-container" containerID="cri-o://e902e8e51d0560c4c59121e2a7019cfbaebcddde19b9e72436023876c7471e7b" gracePeriod=30 Apr 22 14:28:11.486945 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.486914 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8"] Apr 22 14:28:11.489925 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.489907 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" Apr 22 14:28:11.504160 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.504137 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8"] Apr 22 14:28:11.560007 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.559915 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60ab39b6-b02c-444b-86e9-400d6ac96d03-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8\" (UID: \"60ab39b6-b02c-444b-86e9-400d6ac96d03\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" Apr 22 14:28:11.564037 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.564009 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9"] Apr 22 14:28:11.567171 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.567154 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" Apr 22 14:28:11.579227 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.579203 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9"] Apr 22 14:28:11.624182 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.624149 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4"] Apr 22 14:28:11.624538 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.624484 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" podUID="b4732b54-3b30-4a6a-9533-205f36c1d827" containerName="kserve-container" containerID="cri-o://0e8517bfebd11806b4fb472ecca9646b001b0c5a1cbbe1e642da93749dcb5e16" gracePeriod=30 Apr 22 14:28:11.661322 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.661283 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ae5343d-0bc9-4a20-939c-3e96db7446ed-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9\" (UID: \"4ae5343d-0bc9-4a20-939c-3e96db7446ed\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" Apr 22 14:28:11.661510 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.661333 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60ab39b6-b02c-444b-86e9-400d6ac96d03-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8\" (UID: \"60ab39b6-b02c-444b-86e9-400d6ac96d03\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" Apr 22 14:28:11.661717 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.661696 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60ab39b6-b02c-444b-86e9-400d6ac96d03-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8\" 
(UID: \"60ab39b6-b02c-444b-86e9-400d6ac96d03\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" Apr 22 14:28:11.762502 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.762466 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ae5343d-0bc9-4a20-939c-3e96db7446ed-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9\" (UID: \"4ae5343d-0bc9-4a20-939c-3e96db7446ed\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" Apr 22 14:28:11.762841 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.762822 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ae5343d-0bc9-4a20-939c-3e96db7446ed-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9\" (UID: \"4ae5343d-0bc9-4a20-939c-3e96db7446ed\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" Apr 22 14:28:11.799401 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.799364 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" Apr 22 14:28:11.876812 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.876773 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" Apr 22 14:28:11.923116 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:11.923079 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8"] Apr 22 14:28:12.017504 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:12.017479 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9"] Apr 22 14:28:12.019917 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:28:12.019887 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ae5343d_0bc9_4a20_939c_3e96db7446ed.slice/crio-b2115dbca163bae4ae56297f3f0b9a0e145240960aa0bedf77558c82e627ffad WatchSource:0}: Error finding container b2115dbca163bae4ae56297f3f0b9a0e145240960aa0bedf77558c82e627ffad: Status 404 returned error can't find the container with id b2115dbca163bae4ae56297f3f0b9a0e145240960aa0bedf77558c82e627ffad Apr 22 14:28:12.840875 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:12.840837 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" event={"ID":"4ae5343d-0bc9-4a20-939c-3e96db7446ed","Type":"ContainerStarted","Data":"58d425c79fec034ef3d42fe82f54cb2618ae1cb98935522582e661f6db38de9c"} Apr 22 14:28:12.840875 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:12.840878 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" event={"ID":"4ae5343d-0bc9-4a20-939c-3e96db7446ed","Type":"ContainerStarted","Data":"b2115dbca163bae4ae56297f3f0b9a0e145240960aa0bedf77558c82e627ffad"} Apr 22 14:28:12.842159 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:12.842133 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" event={"ID":"60ab39b6-b02c-444b-86e9-400d6ac96d03","Type":"ContainerStarted","Data":"71220e10222199618944a6c1d5128c44e050c13a64afb4c6ac40d2c414accc28"} Apr 22 14:28:12.842273 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:12.842165 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" event={"ID":"60ab39b6-b02c-444b-86e9-400d6ac96d03","Type":"ContainerStarted","Data":"84ccaed7301991a9085699bfff41b3d769592d99c7543392be7cda1b9a11b462"} Apr 22 14:28:15.629794 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.629732 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" podUID="b20ca612-2287-4a0f-a3fc-210e26f59ada" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 22 14:28:15.694464 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.694429 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" Apr 22 14:28:15.793216 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.793122 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4732b54-3b30-4a6a-9533-205f36c1d827-kserve-provision-location\") pod \"b4732b54-3b30-4a6a-9533-205f36c1d827\" (UID: \"b4732b54-3b30-4a6a-9533-205f36c1d827\") " Apr 22 14:28:15.793503 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.793476 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4732b54-3b30-4a6a-9533-205f36c1d827-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b4732b54-3b30-4a6a-9533-205f36c1d827" (UID: "b4732b54-3b30-4a6a-9533-205f36c1d827"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:28:15.825090 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.825051 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" podUID="2f2818ac-02c9-42a0-ac83-83dc7d6d80c9" containerName="model-chainer-raw-dc953" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:28:15.849829 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.849798 2566 generic.go:358] "Generic (PLEG): container finished" podID="60ab39b6-b02c-444b-86e9-400d6ac96d03" containerID="71220e10222199618944a6c1d5128c44e050c13a64afb4c6ac40d2c414accc28" exitCode=0 Apr 22 14:28:15.849989 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.849866 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" event={"ID":"60ab39b6-b02c-444b-86e9-400d6ac96d03","Type":"ContainerDied","Data":"71220e10222199618944a6c1d5128c44e050c13a64afb4c6ac40d2c414accc28"} Apr 22 14:28:15.850922 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.850903 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:28:15.851287 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.851265 2566 generic.go:358] "Generic (PLEG): container finished" podID="b4732b54-3b30-4a6a-9533-205f36c1d827" containerID="0e8517bfebd11806b4fb472ecca9646b001b0c5a1cbbe1e642da93749dcb5e16" exitCode=0 Apr 22 14:28:15.851392 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.851323 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" Apr 22 14:28:15.851392 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.851348 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" event={"ID":"b4732b54-3b30-4a6a-9533-205f36c1d827","Type":"ContainerDied","Data":"0e8517bfebd11806b4fb472ecca9646b001b0c5a1cbbe1e642da93749dcb5e16"} Apr 22 14:28:15.851392 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.851376 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4" event={"ID":"b4732b54-3b30-4a6a-9533-205f36c1d827","Type":"ContainerDied","Data":"b030345bffca9584b0cfcc38b32f6da3e7849a9de70b0ef6bbf6a6e0b9604e42"} Apr 22 14:28:15.851534 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.851395 2566 scope.go:117] "RemoveContainer" containerID="0e8517bfebd11806b4fb472ecca9646b001b0c5a1cbbe1e642da93749dcb5e16" Apr 22 14:28:15.852694 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.852537 2566 generic.go:358] "Generic (PLEG): container finished" podID="4ae5343d-0bc9-4a20-939c-3e96db7446ed" containerID="58d425c79fec034ef3d42fe82f54cb2618ae1cb98935522582e661f6db38de9c" exitCode=0 Apr 22 14:28:15.852694 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.852566 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" event={"ID":"4ae5343d-0bc9-4a20-939c-3e96db7446ed","Type":"ContainerDied","Data":"58d425c79fec034ef3d42fe82f54cb2618ae1cb98935522582e661f6db38de9c"} Apr 22 14:28:15.874086 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.874058 2566 scope.go:117] "RemoveContainer" containerID="2ff734216cbf39e414f17bbc5a348dc7613055d9e4cf3fa9183aa0a2735ee17c" Apr 22 
14:28:15.892810 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.892789 2566 scope.go:117] "RemoveContainer" containerID="0e8517bfebd11806b4fb472ecca9646b001b0c5a1cbbe1e642da93749dcb5e16" Apr 22 14:28:15.893146 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:28:15.893123 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e8517bfebd11806b4fb472ecca9646b001b0c5a1cbbe1e642da93749dcb5e16\": container with ID starting with 0e8517bfebd11806b4fb472ecca9646b001b0c5a1cbbe1e642da93749dcb5e16 not found: ID does not exist" containerID="0e8517bfebd11806b4fb472ecca9646b001b0c5a1cbbe1e642da93749dcb5e16" Apr 22 14:28:15.893229 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.893153 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e8517bfebd11806b4fb472ecca9646b001b0c5a1cbbe1e642da93749dcb5e16"} err="failed to get container status \"0e8517bfebd11806b4fb472ecca9646b001b0c5a1cbbe1e642da93749dcb5e16\": rpc error: code = NotFound desc = could not find container \"0e8517bfebd11806b4fb472ecca9646b001b0c5a1cbbe1e642da93749dcb5e16\": container with ID starting with 0e8517bfebd11806b4fb472ecca9646b001b0c5a1cbbe1e642da93749dcb5e16 not found: ID does not exist" Apr 22 14:28:15.893229 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.893173 2566 scope.go:117] "RemoveContainer" containerID="2ff734216cbf39e414f17bbc5a348dc7613055d9e4cf3fa9183aa0a2735ee17c" Apr 22 14:28:15.893498 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:28:15.893475 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ff734216cbf39e414f17bbc5a348dc7613055d9e4cf3fa9183aa0a2735ee17c\": container with ID starting with 2ff734216cbf39e414f17bbc5a348dc7613055d9e4cf3fa9183aa0a2735ee17c not found: ID does not exist" containerID="2ff734216cbf39e414f17bbc5a348dc7613055d9e4cf3fa9183aa0a2735ee17c" Apr 22 14:28:15.893604 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.893509 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ff734216cbf39e414f17bbc5a348dc7613055d9e4cf3fa9183aa0a2735ee17c"} err="failed to get container status \"2ff734216cbf39e414f17bbc5a348dc7613055d9e4cf3fa9183aa0a2735ee17c\": rpc error: code = NotFound desc = could not find container \"2ff734216cbf39e414f17bbc5a348dc7613055d9e4cf3fa9183aa0a2735ee17c\": container with ID starting with 2ff734216cbf39e414f17bbc5a348dc7613055d9e4cf3fa9183aa0a2735ee17c not found: ID does not exist" Apr 22 14:28:15.893726 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.893704 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4732b54-3b30-4a6a-9533-205f36c1d827-kserve-provision-location\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\"" Apr 22 14:28:15.903260 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.903237 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4"] Apr 22 14:28:15.906568 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:15.906542 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-dc953-predictor-b86bc8b59-rxlq4"] Apr 22 14:28:16.289320 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.289291 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" Apr 22 14:28:16.397542 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.397454 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b20ca612-2287-4a0f-a3fc-210e26f59ada-kserve-provision-location\") pod \"b20ca612-2287-4a0f-a3fc-210e26f59ada\" (UID: \"b20ca612-2287-4a0f-a3fc-210e26f59ada\") " Apr 22 14:28:16.397798 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.397763 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b20ca612-2287-4a0f-a3fc-210e26f59ada-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b20ca612-2287-4a0f-a3fc-210e26f59ada" (UID: "b20ca612-2287-4a0f-a3fc-210e26f59ada"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:28:16.498671 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.498631 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b20ca612-2287-4a0f-a3fc-210e26f59ada-kserve-provision-location\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\"" Apr 22 14:28:16.633019 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.632984 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4732b54-3b30-4a6a-9533-205f36c1d827" path="/var/lib/kubelet/pods/b4732b54-3b30-4a6a-9533-205f36c1d827/volumes" Apr 22 14:28:16.856914 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.856882 2566 generic.go:358] "Generic (PLEG): container finished" podID="b20ca612-2287-4a0f-a3fc-210e26f59ada" containerID="e902e8e51d0560c4c59121e2a7019cfbaebcddde19b9e72436023876c7471e7b" exitCode=0 Apr 22 14:28:16.857075 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.856943 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" Apr 22 14:28:16.857075 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.856950 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" event={"ID":"b20ca612-2287-4a0f-a3fc-210e26f59ada","Type":"ContainerDied","Data":"e902e8e51d0560c4c59121e2a7019cfbaebcddde19b9e72436023876c7471e7b"} Apr 22 14:28:16.857075 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.856995 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx" event={"ID":"b20ca612-2287-4a0f-a3fc-210e26f59ada","Type":"ContainerDied","Data":"c4d4bbe15481ed77004170d829457c89d0758b6bb7f57b13bb605a9f4bff6d22"} Apr 22 14:28:16.857075 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.857018 2566 scope.go:117] "RemoveContainer" containerID="e902e8e51d0560c4c59121e2a7019cfbaebcddde19b9e72436023876c7471e7b" Apr 22 14:28:16.858601 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.858535 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" event={"ID":"4ae5343d-0bc9-4a20-939c-3e96db7446ed","Type":"ContainerStarted","Data":"8935d18fdeb4ee1c476310e50b7dd4f591d42de09561118ef79acddba2eb82e0"} Apr 22 14:28:16.858850 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.858829 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" Apr 22 14:28:16.860102 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.860077 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" podUID="4ae5343d-0bc9-4a20-939c-3e96db7446ed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 14:28:16.860416 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.860399 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" event={"ID":"60ab39b6-b02c-444b-86e9-400d6ac96d03","Type":"ContainerStarted","Data":"3dd686e227f8a20a107c637f662e48db5a08315068589474f5427661939b5201"} Apr 22 14:28:16.860653 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.860632 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" Apr 22 14:28:16.861493 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.861469 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" podUID="60ab39b6-b02c-444b-86e9-400d6ac96d03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 22 14:28:16.864287 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.864270 2566 scope.go:117] "RemoveContainer" containerID="ae85021e5aef950c0b9c5da6b921d4207ccb3611ad3750c98b1a33b26ca8095e" Apr 22 14:28:16.870863 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.870846 2566 scope.go:117] "RemoveContainer" containerID="e902e8e51d0560c4c59121e2a7019cfbaebcddde19b9e72436023876c7471e7b" Apr 22 14:28:16.871149 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:28:16.871124 2566 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e902e8e51d0560c4c59121e2a7019cfbaebcddde19b9e72436023876c7471e7b\": container with ID starting with e902e8e51d0560c4c59121e2a7019cfbaebcddde19b9e72436023876c7471e7b not found: ID does not exist" containerID="e902e8e51d0560c4c59121e2a7019cfbaebcddde19b9e72436023876c7471e7b" Apr 22 14:28:16.871197 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.871160 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e902e8e51d0560c4c59121e2a7019cfbaebcddde19b9e72436023876c7471e7b"} err="failed to get container status \"e902e8e51d0560c4c59121e2a7019cfbaebcddde19b9e72436023876c7471e7b\": rpc error: code = NotFound desc = could not find container \"e902e8e51d0560c4c59121e2a7019cfbaebcddde19b9e72436023876c7471e7b\": container with ID starting with e902e8e51d0560c4c59121e2a7019cfbaebcddde19b9e72436023876c7471e7b not found: ID does not exist" Apr 22 14:28:16.871197 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.871184 2566 scope.go:117] "RemoveContainer" containerID="ae85021e5aef950c0b9c5da6b921d4207ccb3611ad3750c98b1a33b26ca8095e" Apr 22 14:28:16.871396 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:28:16.871379 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae85021e5aef950c0b9c5da6b921d4207ccb3611ad3750c98b1a33b26ca8095e\": container with ID starting with ae85021e5aef950c0b9c5da6b921d4207ccb3611ad3750c98b1a33b26ca8095e not found: ID does not exist" containerID="ae85021e5aef950c0b9c5da6b921d4207ccb3611ad3750c98b1a33b26ca8095e" Apr 22 14:28:16.871462 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.871404 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae85021e5aef950c0b9c5da6b921d4207ccb3611ad3750c98b1a33b26ca8095e"} err="failed to get container status \"ae85021e5aef950c0b9c5da6b921d4207ccb3611ad3750c98b1a33b26ca8095e\": rpc error: code = NotFound desc = could not find container \"ae85021e5aef950c0b9c5da6b921d4207ccb3611ad3750c98b1a33b26ca8095e\": container with ID starting with ae85021e5aef950c0b9c5da6b921d4207ccb3611ad3750c98b1a33b26ca8095e not found: ID does not exist" Apr 22 14:28:16.880618 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.880581 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" podStartSLOduration=5.880553188 podStartE2EDuration="5.880553188s" podCreationTimestamp="2026-04-22 14:28:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:28:16.879735563 +0000 UTC m=+778.859246947" watchObservedRunningTime="2026-04-22 14:28:16.880553188 +0000 UTC m=+778.860064568" Apr 22 14:28:16.892173 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.892149 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx"] Apr 22 14:28:16.895767 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.895749 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-dc953-predictor-559867c7-vt6bx"] Apr 22 14:28:16.911415 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:16.911372 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" podStartSLOduration=5.911359632 podStartE2EDuration="5.911359632s" podCreationTimestamp="2026-04-22 14:28:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:28:16.910231452 +0000 UTC m=+778.889742829" watchObservedRunningTime="2026-04-22 14:28:16.911359632 +0000 UTC m=+778.890871009" Apr 22 14:28:17.864548 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:17.864503 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" podUID="4ae5343d-0bc9-4a20-939c-3e96db7446ed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 14:28:17.864942 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:17.864503 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" podUID="60ab39b6-b02c-444b-86e9-400d6ac96d03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 22 14:28:18.632628 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:18.632591 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b20ca612-2287-4a0f-a3fc-210e26f59ada" path="/var/lib/kubelet/pods/b20ca612-2287-4a0f-a3fc-210e26f59ada/volumes" Apr 22 14:28:20.824452 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:20.824396 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" podUID="2f2818ac-02c9-42a0-ac83-83dc7d6d80c9" containerName="model-chainer-raw-dc953" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:28:25.825618 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:25.825572 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" podUID="2f2818ac-02c9-42a0-ac83-83dc7d6d80c9" containerName="model-chainer-raw-dc953" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:28:25.826040 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:25.825678 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" Apr 22 14:28:27.865263 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:27.865221 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" podUID="60ab39b6-b02c-444b-86e9-400d6ac96d03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 22 14:28:27.865681 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:27.865219 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" podUID="4ae5343d-0bc9-4a20-939c-3e96db7446ed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 14:28:30.824881 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:30.824824 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" podUID="2f2818ac-02c9-42a0-ac83-83dc7d6d80c9" containerName="model-chainer-raw-dc953" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:28:35.825313 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:35.825276 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" podUID="2f2818ac-02c9-42a0-ac83-83dc7d6d80c9" containerName="model-chainer-raw-dc953" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:28:37.864813 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:37.864729 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" podUID="60ab39b6-b02c-444b-86e9-400d6ac96d03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 22 14:28:37.864813 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:37.864727 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" podUID="4ae5343d-0bc9-4a20-939c-3e96db7446ed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 14:28:40.826088 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:40.826049 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" podUID="2f2818ac-02c9-42a0-ac83-83dc7d6d80c9" containerName="model-chainer-raw-dc953" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:28:41.309555 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:28:41.309526 2566 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f2818ac_02c9_42a0_ac83_83dc7d6d80c9.slice/crio-conmon-83ecfdf9cfa37a6b2508fbc1c2827dd422ce237eb490313ec50e3616e079c078.scope\": RecentStats: unable to find data in memory cache]" Apr 22 14:28:41.309680 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:28:41.309658 2566 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f2818ac_02c9_42a0_ac83_83dc7d6d80c9.slice/crio-conmon-83ecfdf9cfa37a6b2508fbc1c2827dd422ce237eb490313ec50e3616e079c078.scope\": RecentStats: unable to find data in memory cache]" Apr 22 14:28:41.417592 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:41.417567 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" Apr 22 14:28:41.476044 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:41.476010 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f2818ac-02c9-42a0-ac83-83dc7d6d80c9-openshift-service-ca-bundle\") pod \"2f2818ac-02c9-42a0-ac83-83dc7d6d80c9\" (UID: \"2f2818ac-02c9-42a0-ac83-83dc7d6d80c9\") " Apr 22 14:28:41.476220 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:41.476073 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f2818ac-02c9-42a0-ac83-83dc7d6d80c9-proxy-tls\") pod \"2f2818ac-02c9-42a0-ac83-83dc7d6d80c9\" (UID: \"2f2818ac-02c9-42a0-ac83-83dc7d6d80c9\") " Apr 22 14:28:41.476397 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:41.476371 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f2818ac-02c9-42a0-ac83-83dc7d6d80c9-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "2f2818ac-02c9-42a0-ac83-83dc7d6d80c9" (UID: "2f2818ac-02c9-42a0-ac83-83dc7d6d80c9"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:28:41.478223 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:41.478201 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2818ac-02c9-42a0-ac83-83dc7d6d80c9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2f2818ac-02c9-42a0-ac83-83dc7d6d80c9" (UID: "2f2818ac-02c9-42a0-ac83-83dc7d6d80c9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:28:41.576801 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:41.576705 2566 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f2818ac-02c9-42a0-ac83-83dc7d6d80c9-openshift-service-ca-bundle\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\"" Apr 22 14:28:41.576801 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:41.576745 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f2818ac-02c9-42a0-ac83-83dc7d6d80c9-proxy-tls\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\"" Apr 22 14:28:41.933780 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:41.933745 2566 generic.go:358] "Generic (PLEG): container finished" podID="2f2818ac-02c9-42a0-ac83-83dc7d6d80c9" containerID="83ecfdf9cfa37a6b2508fbc1c2827dd422ce237eb490313ec50e3616e079c078" exitCode=0 Apr 22 14:28:41.934208 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:41.933790 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" event={"ID":"2f2818ac-02c9-42a0-ac83-83dc7d6d80c9","Type":"ContainerDied","Data":"83ecfdf9cfa37a6b2508fbc1c2827dd422ce237eb490313ec50e3616e079c078"} Apr 22 14:28:41.934208 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:41.933813 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" Apr 22 14:28:41.934208 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:41.933836 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p" event={"ID":"2f2818ac-02c9-42a0-ac83-83dc7d6d80c9","Type":"ContainerDied","Data":"7fc3e9150498736815027fb39ba07b17eb807cee9bc5dcf91fdf7cf512c0a7de"} Apr 22 14:28:41.934208 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:41.933857 2566 scope.go:117] "RemoveContainer" containerID="83ecfdf9cfa37a6b2508fbc1c2827dd422ce237eb490313ec50e3616e079c078" Apr 22 14:28:41.941152 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:41.941037 2566 scope.go:117] "RemoveContainer" containerID="83ecfdf9cfa37a6b2508fbc1c2827dd422ce237eb490313ec50e3616e079c078" Apr 22 14:28:41.941371 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:28:41.941345 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83ecfdf9cfa37a6b2508fbc1c2827dd422ce237eb490313ec50e3616e079c078\": container with ID starting with 83ecfdf9cfa37a6b2508fbc1c2827dd422ce237eb490313ec50e3616e079c078 not found: ID does not exist" containerID="83ecfdf9cfa37a6b2508fbc1c2827dd422ce237eb490313ec50e3616e079c078" Apr 22 14:28:41.941467 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:41.941384 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83ecfdf9cfa37a6b2508fbc1c2827dd422ce237eb490313ec50e3616e079c078"} err="failed to get container status \"83ecfdf9cfa37a6b2508fbc1c2827dd422ce237eb490313ec50e3616e079c078\": rpc error: code = NotFound desc = could not find container \"83ecfdf9cfa37a6b2508fbc1c2827dd422ce237eb490313ec50e3616e079c078\": container with ID starting with 83ecfdf9cfa37a6b2508fbc1c2827dd422ce237eb490313ec50e3616e079c078 not found: ID does not exist" Apr 22 14:28:41.955164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:41.955138 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p"] Apr 22 14:28:41.958835 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:41.958813 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-dc953-5bffc7f67f-lqm6p"] Apr 22 14:28:42.633041 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:42.633007 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f2818ac-02c9-42a0-ac83-83dc7d6d80c9" path="/var/lib/kubelet/pods/2f2818ac-02c9-42a0-ac83-83dc7d6d80c9/volumes" Apr 22 14:28:47.865467 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:47.865404 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" podUID="60ab39b6-b02c-444b-86e9-400d6ac96d03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 22 14:28:47.865848 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:47.865403 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" podUID="4ae5343d-0bc9-4a20-939c-3e96db7446ed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 14:28:57.865011 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:57.864968 2566 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" podUID="4ae5343d-0bc9-4a20-939c-3e96db7446ed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 14:28:57.865391 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:28:57.864968 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" podUID="60ab39b6-b02c-444b-86e9-400d6ac96d03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 22 14:29:07.865416 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:07.865360 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" podUID="4ae5343d-0bc9-4a20-939c-3e96db7446ed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 14:29:07.865982 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:07.865375 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" podUID="60ab39b6-b02c-444b-86e9-400d6ac96d03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 22 14:29:17.864584 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:17.864521 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" podUID="60ab39b6-b02c-444b-86e9-400d6ac96d03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 22 14:29:17.865548 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:17.865522 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" Apr 22 14:29:27.866117 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:27.866082 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" Apr 22 14:29:41.506644 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.506605 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b"] Apr 22 14:29:41.507179 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.506958 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4732b54-3b30-4a6a-9533-205f36c1d827" containerName="storage-initializer" Apr 22 14:29:41.507179 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.506974 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4732b54-3b30-4a6a-9533-205f36c1d827" containerName="storage-initializer" Apr 22 14:29:41.507179 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.506984 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4732b54-3b30-4a6a-9533-205f36c1d827" containerName="kserve-container" Apr 22 14:29:41.507179 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.506992 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4732b54-3b30-4a6a-9533-205f36c1d827" containerName="kserve-container" Apr 22 14:29:41.507179 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.507004 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="b20ca612-2287-4a0f-a3fc-210e26f59ada" containerName="storage-initializer" Apr 22 14:29:41.507179 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.507013 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20ca612-2287-4a0f-a3fc-210e26f59ada" containerName="storage-initializer" Apr 22 14:29:41.507179 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.507024 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b20ca612-2287-4a0f-a3fc-210e26f59ada" containerName="kserve-container" Apr 22 14:29:41.507179 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.507031 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20ca612-2287-4a0f-a3fc-210e26f59ada" containerName="kserve-container" Apr 22 14:29:41.507179 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.507044 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f2818ac-02c9-42a0-ac83-83dc7d6d80c9" containerName="model-chainer-raw-dc953" Apr 22 14:29:41.507179 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.507052 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f2818ac-02c9-42a0-ac83-83dc7d6d80c9" containerName="model-chainer-raw-dc953" Apr 22 14:29:41.507179 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.507150 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f2818ac-02c9-42a0-ac83-83dc7d6d80c9" containerName="model-chainer-raw-dc953" Apr 22 14:29:41.507179 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.507163 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="b20ca612-2287-4a0f-a3fc-210e26f59ada" containerName="kserve-container" Apr 22 14:29:41.507179 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.507171 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4732b54-3b30-4a6a-9533-205f36c1d827" containerName="kserve-container" Apr 22 14:29:41.509934 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.509912 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" Apr 22 14:29:41.512626 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.512605 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-2ea51-serving-cert\"" Apr 22 14:29:41.512793 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.512764 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-2ea51-kube-rbac-proxy-sar-config\"" Apr 22 14:29:41.512927 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.512905 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 14:29:41.518393 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.518374 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b"] Apr 22 14:29:41.704373 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.704334 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21106ddb-46e3-44d5-8124-34f5001722b5-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b\" (UID: \"21106ddb-46e3-44d5-8124-34f5001722b5\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" Apr 22 14:29:41.704561 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.704414 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21106ddb-46e3-44d5-8124-34f5001722b5-proxy-tls\") pod \"model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b\" (UID: \"21106ddb-46e3-44d5-8124-34f5001722b5\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" Apr 22 14:29:41.805251 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.805165 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21106ddb-46e3-44d5-8124-34f5001722b5-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b\" (UID: \"21106ddb-46e3-44d5-8124-34f5001722b5\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" Apr 22 14:29:41.805251 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.805213 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21106ddb-46e3-44d5-8124-34f5001722b5-proxy-tls\") pod \"model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b\" (UID: \"21106ddb-46e3-44d5-8124-34f5001722b5\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" Apr 22 14:29:41.805835 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.805800 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21106ddb-46e3-44d5-8124-34f5001722b5-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b\" (UID: \"21106ddb-46e3-44d5-8124-34f5001722b5\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" Apr 22 14:29:41.807590 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.807575 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/21106ddb-46e3-44d5-8124-34f5001722b5-proxy-tls\") pod \"model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b\" (UID: \"21106ddb-46e3-44d5-8124-34f5001722b5\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" Apr 22 14:29:41.820290 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.820269 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" Apr 22 14:29:41.934884 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:41.934859 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b"] Apr 22 14:29:41.937330 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:29:41.937301 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21106ddb_46e3_44d5_8124_34f5001722b5.slice/crio-c8f3fcb7d8f149819b2bec9d516f0e5dcd12ccd3f55316427527c0f58d85e12b WatchSource:0}: Error finding container c8f3fcb7d8f149819b2bec9d516f0e5dcd12ccd3f55316427527c0f58d85e12b: Status 404 returned error can't find the container with id c8f3fcb7d8f149819b2bec9d516f0e5dcd12ccd3f55316427527c0f58d85e12b Apr 22 14:29:42.096024 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:42.095938 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" event={"ID":"21106ddb-46e3-44d5-8124-34f5001722b5","Type":"ContainerStarted","Data":"cb8652abff90c7073394df880a514305bf4ca17df391ce893ef2a12e6d858daf"} Apr 22 14:29:42.096024 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:42.095973 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" event={"ID":"21106ddb-46e3-44d5-8124-34f5001722b5","Type":"ContainerStarted","Data":"c8f3fcb7d8f149819b2bec9d516f0e5dcd12ccd3f55316427527c0f58d85e12b"} Apr 22 14:29:42.096024 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:42.095998 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" Apr 22 14:29:42.113680 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:42.113641 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" podStartSLOduration=1.113629081 podStartE2EDuration="1.113629081s" podCreationTimestamp="2026-04-22 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:29:42.113467241 +0000 UTC m=+864.092978617" watchObservedRunningTime="2026-04-22 14:29:42.113629081 +0000 UTC m=+864.093140460" Apr 22 14:29:48.106282 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:48.106251 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" Apr 22 14:29:51.567167 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:51.567140 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b"] Apr 22 14:29:51.567569 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:51.567356 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" podUID="21106ddb-46e3-44d5-8124-34f5001722b5" containerName="model-chainer-raw-hpa-2ea51" 
containerID="cri-o://cb8652abff90c7073394df880a514305bf4ca17df391ce893ef2a12e6d858daf" gracePeriod=30 Apr 22 14:29:51.711087 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:51.711055 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8"] Apr 22 14:29:51.711350 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:51.711328 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" podUID="60ab39b6-b02c-444b-86e9-400d6ac96d03" containerName="kserve-container" containerID="cri-o://3dd686e227f8a20a107c637f662e48db5a08315068589474f5427661939b5201" gracePeriod=30 Apr 22 14:29:51.748811 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:51.748778 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-8d0fa-predictor-5f649c8785-gh8cl"] Apr 22 14:29:51.751987 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:51.751971 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-8d0fa-predictor-5f649c8785-gh8cl" Apr 22 14:29:51.759992 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:51.759971 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-8d0fa-predictor-5f649c8785-gh8cl"] Apr 22 14:29:51.761663 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:51.761647 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-8d0fa-predictor-5f649c8785-gh8cl" Apr 22 14:29:51.844761 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:51.844722 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9"] Apr 22 14:29:51.845044 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:51.845019 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" podUID="4ae5343d-0bc9-4a20-939c-3e96db7446ed" containerName="kserve-container" containerID="cri-o://8935d18fdeb4ee1c476310e50b7dd4f591d42de09561118ef79acddba2eb82e0" gracePeriod=30 Apr 22 14:29:51.883096 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:51.883072 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-8d0fa-predictor-5f649c8785-gh8cl"] Apr 22 14:29:51.885110 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:29:51.885076 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcff0fcbf_3994_4c6d_b36d_4520955398b2.slice/crio-2496e2d113345020d09ef6e6ae323e9c4e4985e630a78ade84dc2fed61273a7d WatchSource:0}: Error finding container 2496e2d113345020d09ef6e6ae323e9c4e4985e630a78ade84dc2fed61273a7d: Status 404 returned error can't find the container with id 2496e2d113345020d09ef6e6ae323e9c4e4985e630a78ade84dc2fed61273a7d Apr 22 14:29:52.124265 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:52.124226 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-8d0fa-predictor-5f649c8785-gh8cl" event={"ID":"cff0fcbf-3994-4c6d-b36d-4520955398b2","Type":"ContainerStarted","Data":"2496e2d113345020d09ef6e6ae323e9c4e4985e630a78ade84dc2fed61273a7d"} Apr 22 14:29:53.104540 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:53.104496 2566 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" podUID="21106ddb-46e3-44d5-8124-34f5001722b5" containerName="model-chainer-raw-hpa-2ea51" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:29:53.128705 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:53.128667 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-8d0fa-predictor-5f649c8785-gh8cl" event={"ID":"cff0fcbf-3994-4c6d-b36d-4520955398b2","Type":"ContainerStarted","Data":"501e333393a423d300d5d1b666e3fb3f57d822385f390132f4b40ec82e57b20b"} Apr 22 14:29:53.128876 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:53.128834 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-8d0fa-predictor-5f649c8785-gh8cl" Apr 22 14:29:53.130704 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:53.130682 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-8d0fa-predictor-5f649c8785-gh8cl" Apr 22 14:29:53.150246 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:53.150201 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-8d0fa-predictor-5f649c8785-gh8cl" podStartSLOduration=1.130770331 podStartE2EDuration="2.150188577s" podCreationTimestamp="2026-04-22 14:29:51 +0000 UTC" firstStartedPulling="2026-04-22 14:29:51.886916023 +0000 UTC m=+873.866427379" lastFinishedPulling="2026-04-22 14:29:52.906334254 +0000 UTC m=+874.885845625" observedRunningTime="2026-04-22 14:29:53.148691822 +0000 UTC m=+875.128203200" watchObservedRunningTime="2026-04-22 14:29:53.150188577 +0000 UTC m=+875.129699954" Apr 22 14:29:55.478382 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:55.478358 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" Apr 22 14:29:55.497064 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:55.497039 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ae5343d-0bc9-4a20-939c-3e96db7446ed-kserve-provision-location\") pod \"4ae5343d-0bc9-4a20-939c-3e96db7446ed\" (UID: \"4ae5343d-0bc9-4a20-939c-3e96db7446ed\") " Apr 22 14:29:55.497358 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:55.497332 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae5343d-0bc9-4a20-939c-3e96db7446ed-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4ae5343d-0bc9-4a20-939c-3e96db7446ed" (UID: "4ae5343d-0bc9-4a20-939c-3e96db7446ed"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:29:55.598258 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:55.598234 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ae5343d-0bc9-4a20-939c-3e96db7446ed-kserve-provision-location\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\"" Apr 22 14:29:56.139505 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.139477 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" Apr 22 14:29:56.141241 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.141211 2566 generic.go:358] "Generic (PLEG): container finished" podID="4ae5343d-0bc9-4a20-939c-3e96db7446ed" containerID="8935d18fdeb4ee1c476310e50b7dd4f591d42de09561118ef79acddba2eb82e0" exitCode=0 Apr 22 14:29:56.141339 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.141291 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" Apr 22 14:29:56.141339 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.141290 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" event={"ID":"4ae5343d-0bc9-4a20-939c-3e96db7446ed","Type":"ContainerDied","Data":"8935d18fdeb4ee1c476310e50b7dd4f591d42de09561118ef79acddba2eb82e0"} Apr 22 14:29:56.141339 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.141333 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9" event={"ID":"4ae5343d-0bc9-4a20-939c-3e96db7446ed","Type":"ContainerDied","Data":"b2115dbca163bae4ae56297f3f0b9a0e145240960aa0bedf77558c82e627ffad"} Apr 22 14:29:56.141490 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.141353 2566 scope.go:117] "RemoveContainer" containerID="8935d18fdeb4ee1c476310e50b7dd4f591d42de09561118ef79acddba2eb82e0" Apr 22 14:29:56.142611 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.142580 2566 generic.go:358] "Generic (PLEG): container finished" podID="60ab39b6-b02c-444b-86e9-400d6ac96d03" containerID="3dd686e227f8a20a107c637f662e48db5a08315068589474f5427661939b5201" exitCode=0 Apr 22 14:29:56.142686 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.142631 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" event={"ID":"60ab39b6-b02c-444b-86e9-400d6ac96d03","Type":"ContainerDied","Data":"3dd686e227f8a20a107c637f662e48db5a08315068589474f5427661939b5201"} Apr 22 14:29:56.142686 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.142657 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" event={"ID":"60ab39b6-b02c-444b-86e9-400d6ac96d03","Type":"ContainerDied","Data":"84ccaed7301991a9085699bfff41b3d769592d99c7543392be7cda1b9a11b462"} Apr 22 14:29:56.142686 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.142664 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8" Apr 22 14:29:56.148632 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.148507 2566 scope.go:117] "RemoveContainer" containerID="58d425c79fec034ef3d42fe82f54cb2618ae1cb98935522582e661f6db38de9c" Apr 22 14:29:56.154914 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.154900 2566 scope.go:117] "RemoveContainer" containerID="8935d18fdeb4ee1c476310e50b7dd4f591d42de09561118ef79acddba2eb82e0" Apr 22 14:29:56.155143 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:29:56.155126 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8935d18fdeb4ee1c476310e50b7dd4f591d42de09561118ef79acddba2eb82e0\": container with ID starting with 8935d18fdeb4ee1c476310e50b7dd4f591d42de09561118ef79acddba2eb82e0 not found: ID does not exist" containerID="8935d18fdeb4ee1c476310e50b7dd4f591d42de09561118ef79acddba2eb82e0" Apr 22 14:29:56.155180 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.155152 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8935d18fdeb4ee1c476310e50b7dd4f591d42de09561118ef79acddba2eb82e0"} err="failed to get container status \"8935d18fdeb4ee1c476310e50b7dd4f591d42de09561118ef79acddba2eb82e0\": rpc error: code = NotFound desc = could not find container \"8935d18fdeb4ee1c476310e50b7dd4f591d42de09561118ef79acddba2eb82e0\": container with ID starting with 8935d18fdeb4ee1c476310e50b7dd4f591d42de09561118ef79acddba2eb82e0 not found: ID does not exist" Apr 22 14:29:56.155180 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.155168 2566 scope.go:117] "RemoveContainer" containerID="58d425c79fec034ef3d42fe82f54cb2618ae1cb98935522582e661f6db38de9c" Apr 22 14:29:56.155404 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:29:56.155387 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58d425c79fec034ef3d42fe82f54cb2618ae1cb98935522582e661f6db38de9c\": container with ID starting with 58d425c79fec034ef3d42fe82f54cb2618ae1cb98935522582e661f6db38de9c not found: ID does not exist" containerID="58d425c79fec034ef3d42fe82f54cb2618ae1cb98935522582e661f6db38de9c" Apr 22 14:29:56.155453 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.155410 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d425c79fec034ef3d42fe82f54cb2618ae1cb98935522582e661f6db38de9c"} err="failed to get container status \"58d425c79fec034ef3d42fe82f54cb2618ae1cb98935522582e661f6db38de9c\": rpc error: code = NotFound desc = could not find container \"58d425c79fec034ef3d42fe82f54cb2618ae1cb98935522582e661f6db38de9c\": container with ID starting with 58d425c79fec034ef3d42fe82f54cb2618ae1cb98935522582e661f6db38de9c not found: ID does not exist" Apr 22 14:29:56.155496 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.155427 2566 scope.go:117] "RemoveContainer" containerID="3dd686e227f8a20a107c637f662e48db5a08315068589474f5427661939b5201" Apr 22 14:29:56.162809 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.162793 2566 scope.go:117] "RemoveContainer" containerID="71220e10222199618944a6c1d5128c44e050c13a64afb4c6ac40d2c414accc28" Apr 22 14:29:56.169234 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.169216 2566 scope.go:117] "RemoveContainer" containerID="3dd686e227f8a20a107c637f662e48db5a08315068589474f5427661939b5201" Apr 22 14:29:56.169516 ip-10-0-129-161 kubenswrapper[2566]: 
E0422 14:29:56.169495 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dd686e227f8a20a107c637f662e48db5a08315068589474f5427661939b5201\": container with ID starting with 3dd686e227f8a20a107c637f662e48db5a08315068589474f5427661939b5201 not found: ID does not exist" containerID="3dd686e227f8a20a107c637f662e48db5a08315068589474f5427661939b5201" Apr 22 14:29:56.169609 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.169527 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dd686e227f8a20a107c637f662e48db5a08315068589474f5427661939b5201"} err="failed to get container status \"3dd686e227f8a20a107c637f662e48db5a08315068589474f5427661939b5201\": rpc error: code = NotFound desc = could not find container \"3dd686e227f8a20a107c637f662e48db5a08315068589474f5427661939b5201\": container with ID starting with 3dd686e227f8a20a107c637f662e48db5a08315068589474f5427661939b5201 not found: ID does not exist" Apr 22 14:29:56.169609 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.169550 2566 scope.go:117] "RemoveContainer" containerID="71220e10222199618944a6c1d5128c44e050c13a64afb4c6ac40d2c414accc28" Apr 22 14:29:56.169927 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:29:56.169890 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71220e10222199618944a6c1d5128c44e050c13a64afb4c6ac40d2c414accc28\": container with ID starting with 71220e10222199618944a6c1d5128c44e050c13a64afb4c6ac40d2c414accc28 not found: ID does not exist" containerID="71220e10222199618944a6c1d5128c44e050c13a64afb4c6ac40d2c414accc28" Apr 22 14:29:56.169973 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.169935 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71220e10222199618944a6c1d5128c44e050c13a64afb4c6ac40d2c414accc28"} err="failed to get container status \"71220e10222199618944a6c1d5128c44e050c13a64afb4c6ac40d2c414accc28\": rpc error: code = NotFound desc = could not find container \"71220e10222199618944a6c1d5128c44e050c13a64afb4c6ac40d2c414accc28\": container with ID starting with 71220e10222199618944a6c1d5128c44e050c13a64afb4c6ac40d2c414accc28 not found: ID does not exist" Apr 22 14:29:56.174239 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.174182 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9"] Apr 22 14:29:56.178701 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.178683 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2ea51-predictor-5d9d79f9c5-n8nc9"] Apr 22 14:29:56.202931 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.202915 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60ab39b6-b02c-444b-86e9-400d6ac96d03-kserve-provision-location\") pod \"60ab39b6-b02c-444b-86e9-400d6ac96d03\" (UID: \"60ab39b6-b02c-444b-86e9-400d6ac96d03\") " Apr 22 14:29:56.203184 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.203165 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60ab39b6-b02c-444b-86e9-400d6ac96d03-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "60ab39b6-b02c-444b-86e9-400d6ac96d03" (UID: 
"60ab39b6-b02c-444b-86e9-400d6ac96d03"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:29:56.304046 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.304013 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60ab39b6-b02c-444b-86e9-400d6ac96d03-kserve-provision-location\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\"" Apr 22 14:29:56.475169 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.475138 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8"] Apr 22 14:29:56.482061 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.482037 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2ea51-predictor-784c998465-snqg8"] Apr 22 14:29:56.633297 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.633263 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ae5343d-0bc9-4a20-939c-3e96db7446ed" path="/var/lib/kubelet/pods/4ae5343d-0bc9-4a20-939c-3e96db7446ed/volumes" Apr 22 14:29:56.633715 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:56.633695 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60ab39b6-b02c-444b-86e9-400d6ac96d03" path="/var/lib/kubelet/pods/60ab39b6-b02c-444b-86e9-400d6ac96d03/volumes" Apr 22 14:29:58.105148 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:29:58.105109 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" podUID="21106ddb-46e3-44d5-8124-34f5001722b5" containerName="model-chainer-raw-hpa-2ea51" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:30:01.803046 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:01.803012 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5"] Apr 22 14:30:01.803512 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:01.803257 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ae5343d-0bc9-4a20-939c-3e96db7446ed" containerName="storage-initializer" Apr 22 14:30:01.803512 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:01.803267 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae5343d-0bc9-4a20-939c-3e96db7446ed" containerName="storage-initializer" Apr 22 14:30:01.803512 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:01.803278 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="60ab39b6-b02c-444b-86e9-400d6ac96d03" containerName="storage-initializer" Apr 22 14:30:01.803512 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:01.803283 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="60ab39b6-b02c-444b-86e9-400d6ac96d03" containerName="storage-initializer" Apr 22 14:30:01.803512 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:01.803297 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="60ab39b6-b02c-444b-86e9-400d6ac96d03" containerName="kserve-container" Apr 22 14:30:01.803512 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:01.803304 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="60ab39b6-b02c-444b-86e9-400d6ac96d03" containerName="kserve-container" Apr 22 14:30:01.803512 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:01.803314 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="4ae5343d-0bc9-4a20-939c-3e96db7446ed" containerName="kserve-container" Apr 22 14:30:01.803512 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:01.803319 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae5343d-0bc9-4a20-939c-3e96db7446ed" containerName="kserve-container" Apr 22 14:30:01.803512 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:01.803364 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="60ab39b6-b02c-444b-86e9-400d6ac96d03" containerName="kserve-container" Apr 22 14:30:01.803512 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:01.803373 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ae5343d-0bc9-4a20-939c-3e96db7446ed" containerName="kserve-container" Apr 22 14:30:01.807960 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:01.807939 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" Apr 22 14:30:01.820595 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:01.820569 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5"] Apr 22 14:30:01.844250 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:01.844221 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b34a83e-7a85-410d-94ec-cf0e565f428f-kserve-provision-location\") pod \"isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5\" (UID: \"0b34a83e-7a85-410d-94ec-cf0e565f428f\") " pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" Apr 22 14:30:01.944997 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:01.944951 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b34a83e-7a85-410d-94ec-cf0e565f428f-kserve-provision-location\") pod \"isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5\" (UID: \"0b34a83e-7a85-410d-94ec-cf0e565f428f\") " pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" Apr 22 14:30:01.945335 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:01.945311 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b34a83e-7a85-410d-94ec-cf0e565f428f-kserve-provision-location\") pod \"isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5\" (UID: \"0b34a83e-7a85-410d-94ec-cf0e565f428f\") " pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" Apr 22 14:30:02.138618 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:02.138573 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" Apr 22 14:30:02.256666 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:02.256612 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5"] Apr 22 14:30:02.260448 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:30:02.260407 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b34a83e_7a85_410d_94ec_cf0e565f428f.slice/crio-2d33ec71f37ee208ddffb989a3463f4059f8ab02836ffd2aa5ef55afaf259d13 WatchSource:0}: Error finding container 2d33ec71f37ee208ddffb989a3463f4059f8ab02836ffd2aa5ef55afaf259d13: Status 404 returned error can't find the container with id 2d33ec71f37ee208ddffb989a3463f4059f8ab02836ffd2aa5ef55afaf259d13 Apr 22 14:30:03.104271 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:03.104230 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" podUID="21106ddb-46e3-44d5-8124-34f5001722b5" containerName="model-chainer-raw-hpa-2ea51" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:30:03.104698 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:03.104371 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" Apr 22 14:30:03.166371 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:03.166335 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" event={"ID":"0b34a83e-7a85-410d-94ec-cf0e565f428f","Type":"ContainerStarted","Data":"c23cfc2dd3850f7ba62809f154240b090a7418aa3a51802b0f806b7d060ea43b"} Apr 22 14:30:03.166371 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:03.166371 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" event={"ID":"0b34a83e-7a85-410d-94ec-cf0e565f428f","Type":"ContainerStarted","Data":"2d33ec71f37ee208ddffb989a3463f4059f8ab02836ffd2aa5ef55afaf259d13"} Apr 22 14:30:07.178252 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:07.178222 2566 generic.go:358] "Generic (PLEG): container finished" podID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerID="c23cfc2dd3850f7ba62809f154240b090a7418aa3a51802b0f806b7d060ea43b" exitCode=0 Apr 22 14:30:07.178666 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:07.178298 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" event={"ID":"0b34a83e-7a85-410d-94ec-cf0e565f428f","Type":"ContainerDied","Data":"c23cfc2dd3850f7ba62809f154240b090a7418aa3a51802b0f806b7d060ea43b"} Apr 22 14:30:08.104482 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:08.104421 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" podUID="21106ddb-46e3-44d5-8124-34f5001722b5" containerName="model-chainer-raw-hpa-2ea51" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:30:08.182772 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:08.182735 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" 
event={"ID":"0b34a83e-7a85-410d-94ec-cf0e565f428f","Type":"ContainerStarted","Data":"ed75e404911ee6a3c1ad3201ee463f8524faa7e5a0d6aacf21945b8112b78c3b"} Apr 22 14:30:08.182772 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:08.182778 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" event={"ID":"0b34a83e-7a85-410d-94ec-cf0e565f428f","Type":"ContainerStarted","Data":"54ccfa481ed41f536e29de20352f2765d6b98142c339f352e59a11e238bf3d1f"} Apr 22 14:30:08.183180 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:08.183067 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" Apr 22 14:30:08.183180 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:08.183095 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" Apr 22 14:30:08.184463 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:08.184421 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 14:30:08.185084 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:08.185062 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:30:08.202376 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:08.202336 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podStartSLOduration=7.20232647 podStartE2EDuration="7.20232647s" podCreationTimestamp="2026-04-22 14:30:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:30:08.200681151 +0000 UTC m=+890.180192530" watchObservedRunningTime="2026-04-22 14:30:08.20232647 +0000 UTC m=+890.181837847" Apr 22 14:30:09.185598 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:09.185561 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 14:30:09.185990 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:09.185878 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:30:13.104297 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:13.104258 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" podUID="21106ddb-46e3-44d5-8124-34f5001722b5" containerName="model-chainer-raw-hpa-2ea51" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:30:18.104415 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:18.104375 2566 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" podUID="21106ddb-46e3-44d5-8124-34f5001722b5" containerName="model-chainer-raw-hpa-2ea51" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:30:18.547764 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:18.547680 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/ovn-acl-logging/0.log" Apr 22 14:30:18.548108 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:18.548085 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/ovn-acl-logging/0.log" Apr 22 14:30:19.186401 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:19.186348 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 14:30:19.186841 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:19.186809 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:30:21.603811 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:30:21.603775 2566 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21106ddb_46e3_44d5_8124_34f5001722b5.slice/crio-cb8652abff90c7073394df880a514305bf4ca17df391ce893ef2a12e6d858daf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21106ddb_46e3_44d5_8124_34f5001722b5.slice/crio-conmon-cb8652abff90c7073394df880a514305bf4ca17df391ce893ef2a12e6d858daf.scope\": RecentStats: unable to find data in memory cache]" Apr 22 14:30:21.711487 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:21.711464 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" Apr 22 14:30:21.779486 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:21.779411 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21106ddb-46e3-44d5-8124-34f5001722b5-proxy-tls\") pod \"21106ddb-46e3-44d5-8124-34f5001722b5\" (UID: \"21106ddb-46e3-44d5-8124-34f5001722b5\") " Apr 22 14:30:21.779655 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:21.779558 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21106ddb-46e3-44d5-8124-34f5001722b5-openshift-service-ca-bundle\") pod \"21106ddb-46e3-44d5-8124-34f5001722b5\" (UID: \"21106ddb-46e3-44d5-8124-34f5001722b5\") " Apr 22 14:30:21.779934 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:21.779907 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21106ddb-46e3-44d5-8124-34f5001722b5-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "21106ddb-46e3-44d5-8124-34f5001722b5" (UID: "21106ddb-46e3-44d5-8124-34f5001722b5"). 
InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:30:21.781567 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:21.781544 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21106ddb-46e3-44d5-8124-34f5001722b5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "21106ddb-46e3-44d5-8124-34f5001722b5" (UID: "21106ddb-46e3-44d5-8124-34f5001722b5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:30:21.880930 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:21.880882 2566 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21106ddb-46e3-44d5-8124-34f5001722b5-openshift-service-ca-bundle\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\"" Apr 22 14:30:21.880930 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:21.880927 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21106ddb-46e3-44d5-8124-34f5001722b5-proxy-tls\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\"" Apr 22 14:30:22.221749 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:22.221651 2566 generic.go:358] "Generic (PLEG): container finished" podID="21106ddb-46e3-44d5-8124-34f5001722b5" containerID="cb8652abff90c7073394df880a514305bf4ca17df391ce893ef2a12e6d858daf" exitCode=137 Apr 22 14:30:22.221749 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:22.221742 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" Apr 22 14:30:22.221982 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:22.221735 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" event={"ID":"21106ddb-46e3-44d5-8124-34f5001722b5","Type":"ContainerDied","Data":"cb8652abff90c7073394df880a514305bf4ca17df391ce893ef2a12e6d858daf"} Apr 22 14:30:22.221982 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:22.221784 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b" event={"ID":"21106ddb-46e3-44d5-8124-34f5001722b5","Type":"ContainerDied","Data":"c8f3fcb7d8f149819b2bec9d516f0e5dcd12ccd3f55316427527c0f58d85e12b"} Apr 22 14:30:22.221982 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:22.221804 2566 scope.go:117] "RemoveContainer" containerID="cb8652abff90c7073394df880a514305bf4ca17df391ce893ef2a12e6d858daf" Apr 22 14:30:22.230143 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:22.230121 2566 scope.go:117] "RemoveContainer" containerID="cb8652abff90c7073394df880a514305bf4ca17df391ce893ef2a12e6d858daf" Apr 22 14:30:22.230421 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:30:22.230398 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb8652abff90c7073394df880a514305bf4ca17df391ce893ef2a12e6d858daf\": container with ID starting with cb8652abff90c7073394df880a514305bf4ca17df391ce893ef2a12e6d858daf not found: ID does not exist" containerID="cb8652abff90c7073394df880a514305bf4ca17df391ce893ef2a12e6d858daf" Apr 22 14:30:22.230544 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:22.230451 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb8652abff90c7073394df880a514305bf4ca17df391ce893ef2a12e6d858daf"} err="failed to 
get container status \"cb8652abff90c7073394df880a514305bf4ca17df391ce893ef2a12e6d858daf\": rpc error: code = NotFound desc = could not find container \"cb8652abff90c7073394df880a514305bf4ca17df391ce893ef2a12e6d858daf\": container with ID starting with cb8652abff90c7073394df880a514305bf4ca17df391ce893ef2a12e6d858daf not found: ID does not exist" Apr 22 14:30:22.243070 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:22.243046 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b"] Apr 22 14:30:22.246667 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:22.246646 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-2ea51-76f765ddd7-ftp5b"] Apr 22 14:30:22.633797 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:22.633760 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21106ddb-46e3-44d5-8124-34f5001722b5" path="/var/lib/kubelet/pods/21106ddb-46e3-44d5-8124-34f5001722b5/volumes" Apr 22 14:30:29.186004 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:29.185958 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 14:30:29.186556 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:29.186394 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:30:39.186265 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:39.186213 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 14:30:39.186715 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:39.186686 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:30:49.186062 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:49.186019 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 14:30:49.186596 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:49.186473 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:30:59.186171 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:59.186125 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.24:8080: connect: connection refused" Apr 22 14:30:59.186655 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:30:59.186632 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:31:09.185672 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:09.185620 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 14:31:09.187933 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:09.186079 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:31:09.629907 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:09.629865 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 14:31:09.630263 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:09.630240 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:31:19.630623 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:19.630593 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" Apr 22 14:31:19.631021 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:19.630715 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" Apr 22 14:31:26.818015 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:26.817986 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-8d0fa-predictor-5f649c8785-gh8cl_cff0fcbf-3994-4c6d-b36d-4520955398b2/kserve-container/0.log" Apr 22 14:31:26.967285 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:26.967252 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5"] Apr 22 14:31:26.967692 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:26.967666 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="kserve-container" containerID="cri-o://54ccfa481ed41f536e29de20352f2765d6b98142c339f352e59a11e238bf3d1f" gracePeriod=30 Apr 22 14:31:26.967761 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:26.967731 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="agent" 
containerID="cri-o://ed75e404911ee6a3c1ad3201ee463f8524faa7e5a0d6aacf21945b8112b78c3b" gracePeriod=30 Apr 22 14:31:27.009006 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.008971 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w"] Apr 22 14:31:27.009253 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.009240 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21106ddb-46e3-44d5-8124-34f5001722b5" containerName="model-chainer-raw-hpa-2ea51" Apr 22 14:31:27.009296 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.009255 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="21106ddb-46e3-44d5-8124-34f5001722b5" containerName="model-chainer-raw-hpa-2ea51" Apr 22 14:31:27.009334 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.009319 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="21106ddb-46e3-44d5-8124-34f5001722b5" containerName="model-chainer-raw-hpa-2ea51" Apr 22 14:31:27.012198 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.012181 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" Apr 22 14:31:27.022475 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.022445 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w"] Apr 22 14:31:27.067891 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.067857 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-8d0fa-predictor-5f649c8785-gh8cl"] Apr 22 14:31:27.068139 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.068085 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-8d0fa-predictor-5f649c8785-gh8cl" podUID="cff0fcbf-3994-4c6d-b36d-4520955398b2" containerName="kserve-container" containerID="cri-o://501e333393a423d300d5d1b666e3fb3f57d822385f390132f4b40ec82e57b20b" gracePeriod=30 Apr 22 14:31:27.136426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.136390 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0430015c-707f-4728-9170-ed9626aaad1b-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w\" (UID: \"0430015c-707f-4728-9170-ed9626aaad1b\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" Apr 22 14:31:27.237395 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.237368 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0430015c-707f-4728-9170-ed9626aaad1b-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w\" (UID: \"0430015c-707f-4728-9170-ed9626aaad1b\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" Apr 22 14:31:27.237706 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.237688 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0430015c-707f-4728-9170-ed9626aaad1b-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w\" (UID: \"0430015c-707f-4728-9170-ed9626aaad1b\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" Apr 22 14:31:27.299978 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.299956 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-8d0fa-predictor-5f649c8785-gh8cl" Apr 22 14:31:27.322764 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.322692 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" Apr 22 14:31:27.397569 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.397543 2566 generic.go:358] "Generic (PLEG): container finished" podID="cff0fcbf-3994-4c6d-b36d-4520955398b2" containerID="501e333393a423d300d5d1b666e3fb3f57d822385f390132f4b40ec82e57b20b" exitCode=2 Apr 22 14:31:27.397688 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.397621 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-8d0fa-predictor-5f649c8785-gh8cl" event={"ID":"cff0fcbf-3994-4c6d-b36d-4520955398b2","Type":"ContainerDied","Data":"501e333393a423d300d5d1b666e3fb3f57d822385f390132f4b40ec82e57b20b"} Apr 22 14:31:27.397688 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.397635 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-8d0fa-predictor-5f649c8785-gh8cl" Apr 22 14:31:27.397688 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.397651 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-8d0fa-predictor-5f649c8785-gh8cl" event={"ID":"cff0fcbf-3994-4c6d-b36d-4520955398b2","Type":"ContainerDied","Data":"2496e2d113345020d09ef6e6ae323e9c4e4985e630a78ade84dc2fed61273a7d"} Apr 22 14:31:27.397688 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.397671 2566 scope.go:117] "RemoveContainer" containerID="501e333393a423d300d5d1b666e3fb3f57d822385f390132f4b40ec82e57b20b" Apr 22 14:31:27.407002 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.406982 2566 scope.go:117] "RemoveContainer" containerID="501e333393a423d300d5d1b666e3fb3f57d822385f390132f4b40ec82e57b20b" Apr 22 14:31:27.407693 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:31:27.407664 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"501e333393a423d300d5d1b666e3fb3f57d822385f390132f4b40ec82e57b20b\": container with ID starting with 501e333393a423d300d5d1b666e3fb3f57d822385f390132f4b40ec82e57b20b not found: ID does not exist" containerID="501e333393a423d300d5d1b666e3fb3f57d822385f390132f4b40ec82e57b20b" Apr 22 14:31:27.407761 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.407687 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"501e333393a423d300d5d1b666e3fb3f57d822385f390132f4b40ec82e57b20b"} err="failed to get container status \"501e333393a423d300d5d1b666e3fb3f57d822385f390132f4b40ec82e57b20b\": rpc error: code = NotFound desc = could not find container \"501e333393a423d300d5d1b666e3fb3f57d822385f390132f4b40ec82e57b20b\": container with ID starting with 501e333393a423d300d5d1b666e3fb3f57d822385f390132f4b40ec82e57b20b not found: ID does not exist" Apr 22 14:31:27.421652 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.421626 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-8d0fa-predictor-5f649c8785-gh8cl"] Apr 22 14:31:27.424661 ip-10-0-129-161 
kubenswrapper[2566]: I0422 14:31:27.424633 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-8d0fa-predictor-5f649c8785-gh8cl"] Apr 22 14:31:27.440949 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:27.440857 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w"] Apr 22 14:31:27.445643 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:31:27.445615 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0430015c_707f_4728_9170_ed9626aaad1b.slice/crio-74b42024d9f9a2cd456201b1559f5308a34b72c47f7edf5e51c5a03aae78402d WatchSource:0}: Error finding container 74b42024d9f9a2cd456201b1559f5308a34b72c47f7edf5e51c5a03aae78402d: Status 404 returned error can't find the container with id 74b42024d9f9a2cd456201b1559f5308a34b72c47f7edf5e51c5a03aae78402d Apr 22 14:31:28.401386 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:28.401343 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" event={"ID":"0430015c-707f-4728-9170-ed9626aaad1b","Type":"ContainerStarted","Data":"ad0e1e4c416fa94ce69b5b4b9b4032489dbbb4c8133be73ca4c5a8ff8ea0244b"} Apr 22 14:31:28.401386 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:28.401386 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" event={"ID":"0430015c-707f-4728-9170-ed9626aaad1b","Type":"ContainerStarted","Data":"74b42024d9f9a2cd456201b1559f5308a34b72c47f7edf5e51c5a03aae78402d"} Apr 22 14:31:28.633066 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:28.633024 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cff0fcbf-3994-4c6d-b36d-4520955398b2" path="/var/lib/kubelet/pods/cff0fcbf-3994-4c6d-b36d-4520955398b2/volumes" Apr 22 14:31:29.630640 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:29.630596 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 14:31:29.631076 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:29.630970 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:31:31.412525 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:31.412487 2566 generic.go:358] "Generic (PLEG): container finished" podID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerID="54ccfa481ed41f536e29de20352f2765d6b98142c339f352e59a11e238bf3d1f" exitCode=0 Apr 22 14:31:31.412920 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:31.412585 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" event={"ID":"0b34a83e-7a85-410d-94ec-cf0e565f428f","Type":"ContainerDied","Data":"54ccfa481ed41f536e29de20352f2765d6b98142c339f352e59a11e238bf3d1f"} Apr 22 14:31:32.416476 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:32.416427 2566 generic.go:358] "Generic (PLEG): container finished" podID="0430015c-707f-4728-9170-ed9626aaad1b" 
containerID="ad0e1e4c416fa94ce69b5b4b9b4032489dbbb4c8133be73ca4c5a8ff8ea0244b" exitCode=0 Apr 22 14:31:32.416476 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:32.416466 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" event={"ID":"0430015c-707f-4728-9170-ed9626aaad1b","Type":"ContainerDied","Data":"ad0e1e4c416fa94ce69b5b4b9b4032489dbbb4c8133be73ca4c5a8ff8ea0244b"} Apr 22 14:31:33.421381 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:33.421347 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" event={"ID":"0430015c-707f-4728-9170-ed9626aaad1b","Type":"ContainerStarted","Data":"0bfe8ad3216936c90a715ba07c2182028b14ce54fa2fb6564c1932d8bfbef745"} Apr 22 14:31:33.421764 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:33.421656 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" Apr 22 14:31:33.422883 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:33.422855 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 14:31:33.436849 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:33.436806 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" podStartSLOduration=7.436792865 podStartE2EDuration="7.436792865s" podCreationTimestamp="2026-04-22 14:31:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:31:33.436458025 +0000 UTC m=+975.415969403" watchObservedRunningTime="2026-04-22 14:31:33.436792865 +0000 UTC m=+975.416304244" Apr 22 14:31:34.425716 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:34.425682 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 14:31:39.629971 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:39.629871 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 14:31:39.630367 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:39.630222 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:31:44.426144 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:44.426091 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: 
connection refused" Apr 22 14:31:49.630537 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:49.630485 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 14:31:49.630954 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:49.630638 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" Apr 22 14:31:49.630954 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:49.630784 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:31:49.630954 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:49.630888 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" Apr 22 14:31:54.425852 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:54.425808 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 14:31:57.115118 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:57.115095 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" Apr 22 14:31:57.250905 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:57.250824 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b34a83e-7a85-410d-94ec-cf0e565f428f-kserve-provision-location\") pod \"0b34a83e-7a85-410d-94ec-cf0e565f428f\" (UID: \"0b34a83e-7a85-410d-94ec-cf0e565f428f\") " Apr 22 14:31:57.251130 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:57.251106 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b34a83e-7a85-410d-94ec-cf0e565f428f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0b34a83e-7a85-410d-94ec-cf0e565f428f" (UID: "0b34a83e-7a85-410d-94ec-cf0e565f428f"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:31:57.352245 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:57.352212 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b34a83e-7a85-410d-94ec-cf0e565f428f-kserve-provision-location\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\"" Apr 22 14:31:57.488628 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:57.488594 2566 generic.go:358] "Generic (PLEG): container finished" podID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerID="ed75e404911ee6a3c1ad3201ee463f8524faa7e5a0d6aacf21945b8112b78c3b" exitCode=137 Apr 22 14:31:57.488784 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:57.488662 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" event={"ID":"0b34a83e-7a85-410d-94ec-cf0e565f428f","Type":"ContainerDied","Data":"ed75e404911ee6a3c1ad3201ee463f8524faa7e5a0d6aacf21945b8112b78c3b"} Apr 22 14:31:57.488784 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:57.488676 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" Apr 22 14:31:57.488784 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:57.488689 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5" event={"ID":"0b34a83e-7a85-410d-94ec-cf0e565f428f","Type":"ContainerDied","Data":"2d33ec71f37ee208ddffb989a3463f4059f8ab02836ffd2aa5ef55afaf259d13"} Apr 22 14:31:57.488784 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:57.488705 2566 scope.go:117] "RemoveContainer" containerID="ed75e404911ee6a3c1ad3201ee463f8524faa7e5a0d6aacf21945b8112b78c3b" Apr 22 14:31:57.496706 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:57.496510 2566 scope.go:117] "RemoveContainer" containerID="54ccfa481ed41f536e29de20352f2765d6b98142c339f352e59a11e238bf3d1f" Apr 22 14:31:57.503104 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:57.503087 2566 scope.go:117] "RemoveContainer" containerID="c23cfc2dd3850f7ba62809f154240b090a7418aa3a51802b0f806b7d060ea43b" Apr 22 14:31:57.509590 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:57.509569 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5"] Apr 22 14:31:57.510040 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:57.510026 2566 scope.go:117] "RemoveContainer" containerID="ed75e404911ee6a3c1ad3201ee463f8524faa7e5a0d6aacf21945b8112b78c3b" Apr 22 14:31:57.510303 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:31:57.510286 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed75e404911ee6a3c1ad3201ee463f8524faa7e5a0d6aacf21945b8112b78c3b\": container with ID starting with ed75e404911ee6a3c1ad3201ee463f8524faa7e5a0d6aacf21945b8112b78c3b not found: ID does not exist" containerID="ed75e404911ee6a3c1ad3201ee463f8524faa7e5a0d6aacf21945b8112b78c3b" Apr 22 14:31:57.510341 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:57.510313 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed75e404911ee6a3c1ad3201ee463f8524faa7e5a0d6aacf21945b8112b78c3b"} err="failed to get container status \"ed75e404911ee6a3c1ad3201ee463f8524faa7e5a0d6aacf21945b8112b78c3b\": rpc error: code = NotFound desc = could not find container 
\"ed75e404911ee6a3c1ad3201ee463f8524faa7e5a0d6aacf21945b8112b78c3b\": container with ID starting with ed75e404911ee6a3c1ad3201ee463f8524faa7e5a0d6aacf21945b8112b78c3b not found: ID does not exist" Apr 22 14:31:57.510341 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:57.510331 2566 scope.go:117] "RemoveContainer" containerID="54ccfa481ed41f536e29de20352f2765d6b98142c339f352e59a11e238bf3d1f" Apr 22 14:31:57.510580 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:31:57.510563 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54ccfa481ed41f536e29de20352f2765d6b98142c339f352e59a11e238bf3d1f\": container with ID starting with 54ccfa481ed41f536e29de20352f2765d6b98142c339f352e59a11e238bf3d1f not found: ID does not exist" containerID="54ccfa481ed41f536e29de20352f2765d6b98142c339f352e59a11e238bf3d1f" Apr 22 14:31:57.510640 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:57.510586 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54ccfa481ed41f536e29de20352f2765d6b98142c339f352e59a11e238bf3d1f"} err="failed to get container status \"54ccfa481ed41f536e29de20352f2765d6b98142c339f352e59a11e238bf3d1f\": rpc error: code = NotFound desc = could not find container \"54ccfa481ed41f536e29de20352f2765d6b98142c339f352e59a11e238bf3d1f\": container with ID starting with 54ccfa481ed41f536e29de20352f2765d6b98142c339f352e59a11e238bf3d1f not found: ID does not exist" Apr 22 14:31:57.510640 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:57.510604 2566 scope.go:117] "RemoveContainer" containerID="c23cfc2dd3850f7ba62809f154240b090a7418aa3a51802b0f806b7d060ea43b" Apr 22 14:31:57.510801 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:31:57.510788 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c23cfc2dd3850f7ba62809f154240b090a7418aa3a51802b0f806b7d060ea43b\": container with ID starting with c23cfc2dd3850f7ba62809f154240b090a7418aa3a51802b0f806b7d060ea43b not found: ID does not exist" containerID="c23cfc2dd3850f7ba62809f154240b090a7418aa3a51802b0f806b7d060ea43b" Apr 22 14:31:57.510838 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:57.510805 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c23cfc2dd3850f7ba62809f154240b090a7418aa3a51802b0f806b7d060ea43b"} err="failed to get container status \"c23cfc2dd3850f7ba62809f154240b090a7418aa3a51802b0f806b7d060ea43b\": rpc error: code = NotFound desc = could not find container \"c23cfc2dd3850f7ba62809f154240b090a7418aa3a51802b0f806b7d060ea43b\": container with ID starting with c23cfc2dd3850f7ba62809f154240b090a7418aa3a51802b0f806b7d060ea43b not found: ID does not exist" Apr 22 14:31:57.513091 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:57.513068 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-8d0fa-predictor-57bf898c67-ffsb5"] Apr 22 14:31:58.632589 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:31:58.632541 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" path="/var/lib/kubelet/pods/0b34a83e-7a85-410d-94ec-cf0e565f428f/volumes" Apr 22 14:32:04.425777 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:32:04.425728 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" podUID="0430015c-707f-4728-9170-ed9626aaad1b" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 14:32:14.426676 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:32:14.426623 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 14:32:24.426546 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:32:24.426501 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 14:32:34.425863 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:32:34.425801 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 14:32:39.629342 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:32:39.629295 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 14:32:49.629286 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:32:49.629236 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 14:32:59.630081 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:32:59.630040 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 14:33:09.629864 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:09.629779 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 14:33:19.629258 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:19.629215 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 14:33:29.629775 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:29.629712 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: 
connect: connection refused" Apr 22 14:33:39.630041 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:39.629998 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 14:33:49.630613 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:49.630584 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" Apr 22 14:33:57.207357 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.207324 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w"] Apr 22 14:33:57.207827 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.207604 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="kserve-container" containerID="cri-o://0bfe8ad3216936c90a715ba07c2182028b14ce54fa2fb6564c1932d8bfbef745" gracePeriod=30 Apr 22 14:33:57.282445 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.282402 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc"] Apr 22 14:33:57.282687 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.282674 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="kserve-container" Apr 22 14:33:57.282730 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.282689 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="kserve-container" Apr 22 14:33:57.282730 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.282702 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="storage-initializer" Apr 22 14:33:57.282730 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.282708 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="storage-initializer" Apr 22 14:33:57.282730 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.282716 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="agent" Apr 22 14:33:57.282730 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.282722 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="agent" Apr 22 14:33:57.282730 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.282730 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cff0fcbf-3994-4c6d-b36d-4520955398b2" containerName="kserve-container" Apr 22 14:33:57.282901 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.282735 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="cff0fcbf-3994-4c6d-b36d-4520955398b2" containerName="kserve-container" Apr 22 14:33:57.282901 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.282775 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="kserve-container" Apr 22 14:33:57.282901 ip-10-0-129-161 
kubenswrapper[2566]: I0422 14:33:57.282783 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b34a83e-7a85-410d-94ec-cf0e565f428f" containerName="agent" Apr 22 14:33:57.282901 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.282793 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="cff0fcbf-3994-4c6d-b36d-4520955398b2" containerName="kserve-container" Apr 22 14:33:57.285611 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.285595 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" Apr 22 14:33:57.293164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.293138 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc"] Apr 22 14:33:57.379798 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.379763 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f390d180-b8f4-4f2d-b76a-f9178bf4d3ff-kserve-provision-location\") pod \"isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc\" (UID: \"f390d180-b8f4-4f2d-b76a-f9178bf4d3ff\") " pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" Apr 22 14:33:57.480425 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.480355 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f390d180-b8f4-4f2d-b76a-f9178bf4d3ff-kserve-provision-location\") pod \"isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc\" (UID: \"f390d180-b8f4-4f2d-b76a-f9178bf4d3ff\") " pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" Apr 22 14:33:57.480742 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.480724 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f390d180-b8f4-4f2d-b76a-f9178bf4d3ff-kserve-provision-location\") pod \"isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc\" (UID: \"f390d180-b8f4-4f2d-b76a-f9178bf4d3ff\") " pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" Apr 22 14:33:57.596709 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.596666 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" Apr 22 14:33:57.711143 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.711115 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc"] Apr 22 14:33:57.713123 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:33:57.713092 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf390d180_b8f4_4f2d_b76a_f9178bf4d3ff.slice/crio-a35320c31bdf500fcd7f0d80bfa6ea1d1643ef1d3bd97d84b48c84a13eda52a3 WatchSource:0}: Error finding container a35320c31bdf500fcd7f0d80bfa6ea1d1643ef1d3bd97d84b48c84a13eda52a3: Status 404 returned error can't find the container with id a35320c31bdf500fcd7f0d80bfa6ea1d1643ef1d3bd97d84b48c84a13eda52a3 Apr 22 14:33:57.714880 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.714863 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:33:57.805398 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.805360 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" event={"ID":"f390d180-b8f4-4f2d-b76a-f9178bf4d3ff","Type":"ContainerStarted","Data":"aa9294bc07266d47ef04e70ed506d2dfa6f36a41f11ee643eff20402e7179bbd"} Apr 22 14:33:57.805398 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:57.805397 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" event={"ID":"f390d180-b8f4-4f2d-b76a-f9178bf4d3ff","Type":"ContainerStarted","Data":"a35320c31bdf500fcd7f0d80bfa6ea1d1643ef1d3bd97d84b48c84a13eda52a3"} Apr 22 14:33:59.629741 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:33:59.629698 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 14:34:01.817362 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:01.817269 2566 generic.go:358] "Generic (PLEG): container finished" podID="f390d180-b8f4-4f2d-b76a-f9178bf4d3ff" containerID="aa9294bc07266d47ef04e70ed506d2dfa6f36a41f11ee643eff20402e7179bbd" exitCode=0 Apr 22 14:34:01.817856 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:01.817349 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" event={"ID":"f390d180-b8f4-4f2d-b76a-f9178bf4d3ff","Type":"ContainerDied","Data":"aa9294bc07266d47ef04e70ed506d2dfa6f36a41f11ee643eff20402e7179bbd"} Apr 22 14:34:02.821164 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:02.821126 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" event={"ID":"f390d180-b8f4-4f2d-b76a-f9178bf4d3ff","Type":"ContainerStarted","Data":"5f66c15467160f85d3d4ec63ed7a9ab52032f99179f5a104eab48895bf31edf5"} Apr 22 14:34:02.821673 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:02.821512 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" Apr 22 14:34:02.822960 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:02.822928 2566 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" podUID="f390d180-b8f4-4f2d-b76a-f9178bf4d3ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 14:34:02.841381 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:02.841333 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" podStartSLOduration=5.841322799 podStartE2EDuration="5.841322799s" podCreationTimestamp="2026-04-22 14:33:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:34:02.840636656 +0000 UTC m=+1124.820148037" watchObservedRunningTime="2026-04-22 14:34:02.841322799 +0000 UTC m=+1124.820834178" Apr 22 14:34:03.824788 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:03.824744 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" podUID="f390d180-b8f4-4f2d-b76a-f9178bf4d3ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 14:34:06.440793 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:06.440769 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" Apr 22 14:34:06.548172 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:06.548083 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0430015c-707f-4728-9170-ed9626aaad1b-kserve-provision-location\") pod \"0430015c-707f-4728-9170-ed9626aaad1b\" (UID: \"0430015c-707f-4728-9170-ed9626aaad1b\") " Apr 22 14:34:06.548418 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:06.548394 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0430015c-707f-4728-9170-ed9626aaad1b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0430015c-707f-4728-9170-ed9626aaad1b" (UID: "0430015c-707f-4728-9170-ed9626aaad1b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:34:06.648733 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:06.648697 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0430015c-707f-4728-9170-ed9626aaad1b-kserve-provision-location\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\"" Apr 22 14:34:06.835064 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:06.834984 2566 generic.go:358] "Generic (PLEG): container finished" podID="0430015c-707f-4728-9170-ed9626aaad1b" containerID="0bfe8ad3216936c90a715ba07c2182028b14ce54fa2fb6564c1932d8bfbef745" exitCode=0 Apr 22 14:34:06.835064 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:06.835051 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" Apr 22 14:34:06.835245 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:06.835065 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" event={"ID":"0430015c-707f-4728-9170-ed9626aaad1b","Type":"ContainerDied","Data":"0bfe8ad3216936c90a715ba07c2182028b14ce54fa2fb6564c1932d8bfbef745"} Apr 22 14:34:06.835245 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:06.835101 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w" event={"ID":"0430015c-707f-4728-9170-ed9626aaad1b","Type":"ContainerDied","Data":"74b42024d9f9a2cd456201b1559f5308a34b72c47f7edf5e51c5a03aae78402d"} Apr 22 14:34:06.835245 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:06.835118 2566 scope.go:117] "RemoveContainer" containerID="0bfe8ad3216936c90a715ba07c2182028b14ce54fa2fb6564c1932d8bfbef745" Apr 22 14:34:06.842853 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:06.842823 2566 scope.go:117] "RemoveContainer" containerID="ad0e1e4c416fa94ce69b5b4b9b4032489dbbb4c8133be73ca4c5a8ff8ea0244b" Apr 22 14:34:06.849767 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:06.849745 2566 scope.go:117] "RemoveContainer" containerID="0bfe8ad3216936c90a715ba07c2182028b14ce54fa2fb6564c1932d8bfbef745" Apr 22 14:34:06.850093 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:34:06.850064 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bfe8ad3216936c90a715ba07c2182028b14ce54fa2fb6564c1932d8bfbef745\": container with ID starting with 0bfe8ad3216936c90a715ba07c2182028b14ce54fa2fb6564c1932d8bfbef745 not found: ID does not exist" containerID="0bfe8ad3216936c90a715ba07c2182028b14ce54fa2fb6564c1932d8bfbef745" Apr 22 14:34:06.850170 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:06.850101 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bfe8ad3216936c90a715ba07c2182028b14ce54fa2fb6564c1932d8bfbef745"} err="failed to get container status \"0bfe8ad3216936c90a715ba07c2182028b14ce54fa2fb6564c1932d8bfbef745\": rpc error: code = NotFound desc = could not find container \"0bfe8ad3216936c90a715ba07c2182028b14ce54fa2fb6564c1932d8bfbef745\": container with ID starting with 0bfe8ad3216936c90a715ba07c2182028b14ce54fa2fb6564c1932d8bfbef745 not found: ID does not exist" Apr 22 14:34:06.850170 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:06.850120 2566 scope.go:117] "RemoveContainer" containerID="ad0e1e4c416fa94ce69b5b4b9b4032489dbbb4c8133be73ca4c5a8ff8ea0244b" Apr 22 14:34:06.850381 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:34:06.850358 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad0e1e4c416fa94ce69b5b4b9b4032489dbbb4c8133be73ca4c5a8ff8ea0244b\": container with ID starting with ad0e1e4c416fa94ce69b5b4b9b4032489dbbb4c8133be73ca4c5a8ff8ea0244b not found: ID does not exist" containerID="ad0e1e4c416fa94ce69b5b4b9b4032489dbbb4c8133be73ca4c5a8ff8ea0244b" Apr 22 14:34:06.850516 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:06.850393 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad0e1e4c416fa94ce69b5b4b9b4032489dbbb4c8133be73ca4c5a8ff8ea0244b"} err="failed to get container status 
\"ad0e1e4c416fa94ce69b5b4b9b4032489dbbb4c8133be73ca4c5a8ff8ea0244b\": rpc error: code = NotFound desc = could not find container \"ad0e1e4c416fa94ce69b5b4b9b4032489dbbb4c8133be73ca4c5a8ff8ea0244b\": container with ID starting with ad0e1e4c416fa94ce69b5b4b9b4032489dbbb4c8133be73ca4c5a8ff8ea0244b not found: ID does not exist" Apr 22 14:34:06.856159 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:06.856136 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w"] Apr 22 14:34:06.858821 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:06.858800 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-0ba4e-predictor-69458d988c-cgc8w"] Apr 22 14:34:08.634824 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:08.634790 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0430015c-707f-4728-9170-ed9626aaad1b" path="/var/lib/kubelet/pods/0430015c-707f-4728-9170-ed9626aaad1b/volumes" Apr 22 14:34:13.825578 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:13.825525 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" podUID="f390d180-b8f4-4f2d-b76a-f9178bf4d3ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 14:34:23.825374 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:23.825329 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" podUID="f390d180-b8f4-4f2d-b76a-f9178bf4d3ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 14:34:33.825302 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:33.825255 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" podUID="f390d180-b8f4-4f2d-b76a-f9178bf4d3ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 14:34:43.824774 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:43.824679 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" podUID="f390d180-b8f4-4f2d-b76a-f9178bf4d3ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 14:34:53.825640 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:34:53.825592 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" podUID="f390d180-b8f4-4f2d-b76a-f9178bf4d3ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 14:35:03.824935 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:03.824886 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" podUID="f390d180-b8f4-4f2d-b76a-f9178bf4d3ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 14:35:13.825626 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:13.825591 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" Apr 22 14:35:17.409167 
ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:17.409136 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j"] Apr 22 14:35:17.409555 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:17.409390 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="kserve-container" Apr 22 14:35:17.409555 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:17.409401 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="kserve-container" Apr 22 14:35:17.409555 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:17.409419 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="storage-initializer" Apr 22 14:35:17.409555 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:17.409425 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="storage-initializer" Apr 22 14:35:17.409555 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:17.409494 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="0430015c-707f-4728-9170-ed9626aaad1b" containerName="kserve-container" Apr 22 14:35:17.412241 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:17.412221 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j" Apr 22 14:35:17.414881 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:17.414857 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-d7c1cc\"" Apr 22 14:35:17.414994 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:17.414912 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-d7c1cc-dockercfg-5x9m2\"" Apr 22 14:35:17.416071 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:17.416050 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 22 14:35:17.420545 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:17.420519 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j"] Apr 22 14:35:17.538706 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:17.538675 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/58d4b1dd-2bea-4a5c-a3bf-2d271dae167a-cabundle-cert\") pod \"isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j\" (UID: \"58d4b1dd-2bea-4a5c-a3bf-2d271dae167a\") " pod="kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j" Apr 22 14:35:17.538879 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:17.538722 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58d4b1dd-2bea-4a5c-a3bf-2d271dae167a-kserve-provision-location\") pod \"isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j\" (UID: \"58d4b1dd-2bea-4a5c-a3bf-2d271dae167a\") " pod="kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j" Apr 22 14:35:17.639944 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:17.639911 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/58d4b1dd-2bea-4a5c-a3bf-2d271dae167a-cabundle-cert\") pod \"isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j\" (UID: \"58d4b1dd-2bea-4a5c-a3bf-2d271dae167a\") " pod="kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j" Apr 22 14:35:17.640139 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:17.639954 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58d4b1dd-2bea-4a5c-a3bf-2d271dae167a-kserve-provision-location\") pod \"isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j\" (UID: \"58d4b1dd-2bea-4a5c-a3bf-2d271dae167a\") " pod="kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j" Apr 22 14:35:17.640300 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:17.640282 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58d4b1dd-2bea-4a5c-a3bf-2d271dae167a-kserve-provision-location\") pod \"isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j\" (UID: \"58d4b1dd-2bea-4a5c-a3bf-2d271dae167a\") " pod="kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j" Apr 22 14:35:17.640652 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:17.640629 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/58d4b1dd-2bea-4a5c-a3bf-2d271dae167a-cabundle-cert\") pod \"isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j\" (UID: \"58d4b1dd-2bea-4a5c-a3bf-2d271dae167a\") " pod="kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j" Apr 22 14:35:17.723372 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:17.723283 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j" Apr 22 14:35:17.843126 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:17.843100 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j"] Apr 22 14:35:17.845327 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:35:17.845300 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58d4b1dd_2bea_4a5c_a3bf_2d271dae167a.slice/crio-a1c0b8eef62b1a2c34671c1ef26903fd9c0ab25544b0249fb06dcb5459a31f34 WatchSource:0}: Error finding container a1c0b8eef62b1a2c34671c1ef26903fd9c0ab25544b0249fb06dcb5459a31f34: Status 404 returned error can't find the container with id a1c0b8eef62b1a2c34671c1ef26903fd9c0ab25544b0249fb06dcb5459a31f34 Apr 22 14:35:18.024290 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:18.024195 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j" event={"ID":"58d4b1dd-2bea-4a5c-a3bf-2d271dae167a","Type":"ContainerStarted","Data":"22e526722faf361eec1cc6fe4c3a744354c38b348db1fcec99cd27253873942e"} Apr 22 14:35:18.024290 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:18.024240 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j" event={"ID":"58d4b1dd-2bea-4a5c-a3bf-2d271dae167a","Type":"ContainerStarted","Data":"a1c0b8eef62b1a2c34671c1ef26903fd9c0ab25544b0249fb06dcb5459a31f34"} Apr 22 14:35:18.565936 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:18.565912 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/ovn-acl-logging/0.log" Apr 22 14:35:18.567572 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:18.567551 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/ovn-acl-logging/0.log" Apr 22 14:35:24.041500 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:24.041469 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j_58d4b1dd-2bea-4a5c-a3bf-2d271dae167a/storage-initializer/0.log" Apr 22 14:35:24.041888 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:24.041515 2566 generic.go:358] "Generic (PLEG): container finished" podID="58d4b1dd-2bea-4a5c-a3bf-2d271dae167a" containerID="22e526722faf361eec1cc6fe4c3a744354c38b348db1fcec99cd27253873942e" exitCode=1 Apr 22 14:35:24.041888 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:24.041591 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j" event={"ID":"58d4b1dd-2bea-4a5c-a3bf-2d271dae167a","Type":"ContainerDied","Data":"22e526722faf361eec1cc6fe4c3a744354c38b348db1fcec99cd27253873942e"} Apr 22 14:35:25.045816 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:25.045790 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j_58d4b1dd-2bea-4a5c-a3bf-2d271dae167a/storage-initializer/0.log" Apr 22 14:35:25.046264 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:25.045845 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j" 
event={"ID":"58d4b1dd-2bea-4a5c-a3bf-2d271dae167a","Type":"ContainerStarted","Data":"f7e4185a023dab052e79f6c40ea7b217b5a9148f1e2fb5fd7d37354d6c3d764f"} Apr 22 14:35:30.061345 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:30.061314 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j_58d4b1dd-2bea-4a5c-a3bf-2d271dae167a/storage-initializer/1.log" Apr 22 14:35:30.061719 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:30.061674 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j_58d4b1dd-2bea-4a5c-a3bf-2d271dae167a/storage-initializer/0.log" Apr 22 14:35:30.061719 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:30.061704 2566 generic.go:358] "Generic (PLEG): container finished" podID="58d4b1dd-2bea-4a5c-a3bf-2d271dae167a" containerID="f7e4185a023dab052e79f6c40ea7b217b5a9148f1e2fb5fd7d37354d6c3d764f" exitCode=1 Apr 22 14:35:30.061793 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:30.061759 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j" event={"ID":"58d4b1dd-2bea-4a5c-a3bf-2d271dae167a","Type":"ContainerDied","Data":"f7e4185a023dab052e79f6c40ea7b217b5a9148f1e2fb5fd7d37354d6c3d764f"} Apr 22 14:35:30.061793 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:30.061790 2566 scope.go:117] "RemoveContainer" containerID="22e526722faf361eec1cc6fe4c3a744354c38b348db1fcec99cd27253873942e" Apr 22 14:35:30.062180 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:30.062164 2566 scope.go:117] "RemoveContainer" containerID="22e526722faf361eec1cc6fe4c3a744354c38b348db1fcec99cd27253873942e" Apr 22 14:35:30.071819 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:35:30.071790 2566 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j_kserve-ci-e2e-test_58d4b1dd-2bea-4a5c-a3bf-2d271dae167a_0 in pod sandbox a1c0b8eef62b1a2c34671c1ef26903fd9c0ab25544b0249fb06dcb5459a31f34 from index: no such id: '22e526722faf361eec1cc6fe4c3a744354c38b348db1fcec99cd27253873942e'" containerID="22e526722faf361eec1cc6fe4c3a744354c38b348db1fcec99cd27253873942e" Apr 22 14:35:30.071903 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:35:30.071839 2566 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j_kserve-ci-e2e-test_58d4b1dd-2bea-4a5c-a3bf-2d271dae167a_0 in pod sandbox a1c0b8eef62b1a2c34671c1ef26903fd9c0ab25544b0249fb06dcb5459a31f34 from index: no such id: '22e526722faf361eec1cc6fe4c3a744354c38b348db1fcec99cd27253873942e'; Skipping pod \"isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j_kserve-ci-e2e-test(58d4b1dd-2bea-4a5c-a3bf-2d271dae167a)\"" logger="UnhandledError" Apr 22 14:35:30.073139 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:35:30.073120 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j_kserve-ci-e2e-test(58d4b1dd-2bea-4a5c-a3bf-2d271dae167a)\"" pod="kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j" 
podUID="58d4b1dd-2bea-4a5c-a3bf-2d271dae167a" Apr 22 14:35:31.066483 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:31.066456 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j_58d4b1dd-2bea-4a5c-a3bf-2d271dae167a/storage-initializer/1.log" Apr 22 14:35:35.434953 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.434914 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc"] Apr 22 14:35:35.435400 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.435212 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" podUID="f390d180-b8f4-4f2d-b76a-f9178bf4d3ff" containerName="kserve-container" containerID="cri-o://5f66c15467160f85d3d4ec63ed7a9ab52032f99179f5a104eab48895bf31edf5" gracePeriod=30 Apr 22 14:35:35.493762 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.493729 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j"] Apr 22 14:35:35.578527 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.578499 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k"] Apr 22 14:35:35.582742 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.582718 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k" Apr 22 14:35:35.585278 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.585256 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-b37114-dockercfg-sbmvx\"" Apr 22 14:35:35.585391 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.585259 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-b37114\"" Apr 22 14:35:35.591109 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.591084 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k"] Apr 22 14:35:35.628508 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.628486 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j_58d4b1dd-2bea-4a5c-a3bf-2d271dae167a/storage-initializer/1.log" Apr 22 14:35:35.628631 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.628584 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j" Apr 22 14:35:35.672908 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.672878 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f8a7d276-9a15-40a2-975e-bd7d4797e2ae-cabundle-cert\") pod \"isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k\" (UID: \"f8a7d276-9a15-40a2-975e-bd7d4797e2ae\") " pod="kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k" Apr 22 14:35:35.673054 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.672917 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8a7d276-9a15-40a2-975e-bd7d4797e2ae-kserve-provision-location\") pod \"isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k\" (UID: \"f8a7d276-9a15-40a2-975e-bd7d4797e2ae\") " pod="kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k" Apr 22 14:35:35.774134 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.774048 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/58d4b1dd-2bea-4a5c-a3bf-2d271dae167a-cabundle-cert\") pod \"58d4b1dd-2bea-4a5c-a3bf-2d271dae167a\" (UID: \"58d4b1dd-2bea-4a5c-a3bf-2d271dae167a\") " Apr 22 14:35:35.774134 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.774128 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58d4b1dd-2bea-4a5c-a3bf-2d271dae167a-kserve-provision-location\") pod \"58d4b1dd-2bea-4a5c-a3bf-2d271dae167a\" (UID: \"58d4b1dd-2bea-4a5c-a3bf-2d271dae167a\") " Apr 22 14:35:35.774345 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.774241 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f8a7d276-9a15-40a2-975e-bd7d4797e2ae-cabundle-cert\") pod \"isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k\" (UID: \"f8a7d276-9a15-40a2-975e-bd7d4797e2ae\") " pod="kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k" Apr 22 14:35:35.774428 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.774395 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58d4b1dd-2bea-4a5c-a3bf-2d271dae167a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "58d4b1dd-2bea-4a5c-a3bf-2d271dae167a" (UID: "58d4b1dd-2bea-4a5c-a3bf-2d271dae167a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:35:35.774596 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.774459 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58d4b1dd-2bea-4a5c-a3bf-2d271dae167a-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "58d4b1dd-2bea-4a5c-a3bf-2d271dae167a" (UID: "58d4b1dd-2bea-4a5c-a3bf-2d271dae167a"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:35:35.774661 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.774422 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8a7d276-9a15-40a2-975e-bd7d4797e2ae-kserve-provision-location\") pod \"isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k\" (UID: \"f8a7d276-9a15-40a2-975e-bd7d4797e2ae\") " pod="kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k" Apr 22 14:35:35.774727 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.774711 2566 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/58d4b1dd-2bea-4a5c-a3bf-2d271dae167a-cabundle-cert\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\"" Apr 22 14:35:35.774780 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.774735 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58d4b1dd-2bea-4a5c-a3bf-2d271dae167a-kserve-provision-location\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\"" Apr 22 14:35:35.774780 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.774715 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8a7d276-9a15-40a2-975e-bd7d4797e2ae-kserve-provision-location\") pod \"isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k\" (UID: \"f8a7d276-9a15-40a2-975e-bd7d4797e2ae\") " pod="kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k" Apr 22 14:35:35.774955 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.774939 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f8a7d276-9a15-40a2-975e-bd7d4797e2ae-cabundle-cert\") pod \"isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k\" (UID: \"f8a7d276-9a15-40a2-975e-bd7d4797e2ae\") " pod="kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k" Apr 22 14:35:35.893720 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:35.893681 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k" Apr 22 14:35:36.006481 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:36.006458 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k"] Apr 22 14:35:36.008895 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:35:36.008865 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8a7d276_9a15_40a2_975e_bd7d4797e2ae.slice/crio-5b25aa8ff60f36535ccd075ccf8170bb0822716894b7bfd47534bfc7c38cfb41 WatchSource:0}: Error finding container 5b25aa8ff60f36535ccd075ccf8170bb0822716894b7bfd47534bfc7c38cfb41: Status 404 returned error can't find the container with id 5b25aa8ff60f36535ccd075ccf8170bb0822716894b7bfd47534bfc7c38cfb41 Apr 22 14:35:36.080226 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:36.080194 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k" event={"ID":"f8a7d276-9a15-40a2-975e-bd7d4797e2ae","Type":"ContainerStarted","Data":"53b429218fb31618dd83096532f6ae3ce7c1de4ff325dc77a845883abf1919e4"} Apr 22 14:35:36.080339 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:36.080233 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k" event={"ID":"f8a7d276-9a15-40a2-975e-bd7d4797e2ae","Type":"ContainerStarted","Data":"5b25aa8ff60f36535ccd075ccf8170bb0822716894b7bfd47534bfc7c38cfb41"} Apr 22 14:35:36.081269 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:36.081249 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j_58d4b1dd-2bea-4a5c-a3bf-2d271dae167a/storage-initializer/1.log" Apr 22 14:35:36.081371 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:36.081294 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j" event={"ID":"58d4b1dd-2bea-4a5c-a3bf-2d271dae167a","Type":"ContainerDied","Data":"a1c0b8eef62b1a2c34671c1ef26903fd9c0ab25544b0249fb06dcb5459a31f34"} Apr 22 14:35:36.081371 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:36.081323 2566 scope.go:117] "RemoveContainer" containerID="f7e4185a023dab052e79f6c40ea7b217b5a9148f1e2fb5fd7d37354d6c3d764f" Apr 22 14:35:36.081371 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:36.081337 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j" Apr 22 14:35:36.120204 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:36.120171 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j"] Apr 22 14:35:36.123406 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:36.123376 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-d7c1cc-predictor-85c49f4765-kp75j"] Apr 22 14:35:36.633476 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:36.633427 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58d4b1dd-2bea-4a5c-a3bf-2d271dae167a" path="/var/lib/kubelet/pods/58d4b1dd-2bea-4a5c-a3bf-2d271dae167a/volumes" Apr 22 14:35:39.776670 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:39.776646 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" Apr 22 14:35:39.904395 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:39.904293 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f390d180-b8f4-4f2d-b76a-f9178bf4d3ff-kserve-provision-location\") pod \"f390d180-b8f4-4f2d-b76a-f9178bf4d3ff\" (UID: \"f390d180-b8f4-4f2d-b76a-f9178bf4d3ff\") " Apr 22 14:35:39.904749 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:39.904712 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f390d180-b8f4-4f2d-b76a-f9178bf4d3ff-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f390d180-b8f4-4f2d-b76a-f9178bf4d3ff" (UID: "f390d180-b8f4-4f2d-b76a-f9178bf4d3ff"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:35:40.004945 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:40.004911 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f390d180-b8f4-4f2d-b76a-f9178bf4d3ff-kserve-provision-location\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\"" Apr 22 14:35:40.094422 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:40.094394 2566 generic.go:358] "Generic (PLEG): container finished" podID="f390d180-b8f4-4f2d-b76a-f9178bf4d3ff" containerID="5f66c15467160f85d3d4ec63ed7a9ab52032f99179f5a104eab48895bf31edf5" exitCode=0 Apr 22 14:35:40.094590 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:40.094474 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" Apr 22 14:35:40.094590 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:40.094474 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" event={"ID":"f390d180-b8f4-4f2d-b76a-f9178bf4d3ff","Type":"ContainerDied","Data":"5f66c15467160f85d3d4ec63ed7a9ab52032f99179f5a104eab48895bf31edf5"} Apr 22 14:35:40.094590 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:40.094578 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc" event={"ID":"f390d180-b8f4-4f2d-b76a-f9178bf4d3ff","Type":"ContainerDied","Data":"a35320c31bdf500fcd7f0d80bfa6ea1d1643ef1d3bd97d84b48c84a13eda52a3"} Apr 22 14:35:40.094741 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:40.094601 2566 scope.go:117] "RemoveContainer" containerID="5f66c15467160f85d3d4ec63ed7a9ab52032f99179f5a104eab48895bf31edf5" Apr 22 14:35:40.101931 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:40.101863 2566 scope.go:117] "RemoveContainer" containerID="aa9294bc07266d47ef04e70ed506d2dfa6f36a41f11ee643eff20402e7179bbd" Apr 22 14:35:40.108499 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:40.108482 2566 scope.go:117] "RemoveContainer" containerID="5f66c15467160f85d3d4ec63ed7a9ab52032f99179f5a104eab48895bf31edf5" Apr 22 14:35:40.108783 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:35:40.108764 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f66c15467160f85d3d4ec63ed7a9ab52032f99179f5a104eab48895bf31edf5\": container with ID starting with 5f66c15467160f85d3d4ec63ed7a9ab52032f99179f5a104eab48895bf31edf5 not found: ID does not exist" 
containerID="5f66c15467160f85d3d4ec63ed7a9ab52032f99179f5a104eab48895bf31edf5" Apr 22 14:35:40.108859 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:40.108797 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f66c15467160f85d3d4ec63ed7a9ab52032f99179f5a104eab48895bf31edf5"} err="failed to get container status \"5f66c15467160f85d3d4ec63ed7a9ab52032f99179f5a104eab48895bf31edf5\": rpc error: code = NotFound desc = could not find container \"5f66c15467160f85d3d4ec63ed7a9ab52032f99179f5a104eab48895bf31edf5\": container with ID starting with 5f66c15467160f85d3d4ec63ed7a9ab52032f99179f5a104eab48895bf31edf5 not found: ID does not exist" Apr 22 14:35:40.108859 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:40.108823 2566 scope.go:117] "RemoveContainer" containerID="aa9294bc07266d47ef04e70ed506d2dfa6f36a41f11ee643eff20402e7179bbd" Apr 22 14:35:40.109046 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:35:40.109028 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa9294bc07266d47ef04e70ed506d2dfa6f36a41f11ee643eff20402e7179bbd\": container with ID starting with aa9294bc07266d47ef04e70ed506d2dfa6f36a41f11ee643eff20402e7179bbd not found: ID does not exist" containerID="aa9294bc07266d47ef04e70ed506d2dfa6f36a41f11ee643eff20402e7179bbd" Apr 22 14:35:40.109088 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:40.109053 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa9294bc07266d47ef04e70ed506d2dfa6f36a41f11ee643eff20402e7179bbd"} err="failed to get container status \"aa9294bc07266d47ef04e70ed506d2dfa6f36a41f11ee643eff20402e7179bbd\": rpc error: code = NotFound desc = could not find container \"aa9294bc07266d47ef04e70ed506d2dfa6f36a41f11ee643eff20402e7179bbd\": container with ID starting with aa9294bc07266d47ef04e70ed506d2dfa6f36a41f11ee643eff20402e7179bbd not found: ID does not exist" Apr 22 14:35:40.114022 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:40.114002 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc"] Apr 22 14:35:40.118079 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:40.118054 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-d7c1cc-predictor-85f6c6b6f6-9j4hc"] Apr 22 14:35:40.633139 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:40.633103 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f390d180-b8f4-4f2d-b76a-f9178bf4d3ff" path="/var/lib/kubelet/pods/f390d180-b8f4-4f2d-b76a-f9178bf4d3ff/volumes" Apr 22 14:35:41.098665 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:41.098642 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k_f8a7d276-9a15-40a2-975e-bd7d4797e2ae/storage-initializer/0.log" Apr 22 14:35:41.098980 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:41.098677 2566 generic.go:358] "Generic (PLEG): container finished" podID="f8a7d276-9a15-40a2-975e-bd7d4797e2ae" containerID="53b429218fb31618dd83096532f6ae3ce7c1de4ff325dc77a845883abf1919e4" exitCode=1 Apr 22 14:35:41.098980 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:41.098708 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k" 
event={"ID":"f8a7d276-9a15-40a2-975e-bd7d4797e2ae","Type":"ContainerDied","Data":"53b429218fb31618dd83096532f6ae3ce7c1de4ff325dc77a845883abf1919e4"} Apr 22 14:35:42.103238 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:42.103213 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k_f8a7d276-9a15-40a2-975e-bd7d4797e2ae/storage-initializer/0.log" Apr 22 14:35:42.103640 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:42.103257 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k" event={"ID":"f8a7d276-9a15-40a2-975e-bd7d4797e2ae","Type":"ContainerStarted","Data":"ecb51073207dc448c138a7e5b713280b5aa77098fb56b7d15fed4d9f40bca072"} Apr 22 14:35:45.111955 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.111927 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k_f8a7d276-9a15-40a2-975e-bd7d4797e2ae/storage-initializer/1.log" Apr 22 14:35:45.112393 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.112298 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k_f8a7d276-9a15-40a2-975e-bd7d4797e2ae/storage-initializer/0.log" Apr 22 14:35:45.112393 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.112338 2566 generic.go:358] "Generic (PLEG): container finished" podID="f8a7d276-9a15-40a2-975e-bd7d4797e2ae" containerID="ecb51073207dc448c138a7e5b713280b5aa77098fb56b7d15fed4d9f40bca072" exitCode=1 Apr 22 14:35:45.112393 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.112372 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k" event={"ID":"f8a7d276-9a15-40a2-975e-bd7d4797e2ae","Type":"ContainerDied","Data":"ecb51073207dc448c138a7e5b713280b5aa77098fb56b7d15fed4d9f40bca072"} Apr 22 14:35:45.112572 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.112408 2566 scope.go:117] "RemoveContainer" containerID="53b429218fb31618dd83096532f6ae3ce7c1de4ff325dc77a845883abf1919e4" Apr 22 14:35:45.112810 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.112793 2566 scope.go:117] "RemoveContainer" containerID="53b429218fb31618dd83096532f6ae3ce7c1de4ff325dc77a845883abf1919e4" Apr 22 14:35:45.122092 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:35:45.122064 2566 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k_kserve-ci-e2e-test_f8a7d276-9a15-40a2-975e-bd7d4797e2ae_0 in pod sandbox 5b25aa8ff60f36535ccd075ccf8170bb0822716894b7bfd47534bfc7c38cfb41 from index: no such id: '53b429218fb31618dd83096532f6ae3ce7c1de4ff325dc77a845883abf1919e4'" containerID="53b429218fb31618dd83096532f6ae3ce7c1de4ff325dc77a845883abf1919e4" Apr 22 14:35:45.122156 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.122100 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53b429218fb31618dd83096532f6ae3ce7c1de4ff325dc77a845883abf1919e4"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k_kserve-ci-e2e-test_f8a7d276-9a15-40a2-975e-bd7d4797e2ae_0 in pod sandbox 5b25aa8ff60f36535ccd075ccf8170bb0822716894b7bfd47534bfc7c38cfb41 from index: no such id: 
'53b429218fb31618dd83096532f6ae3ce7c1de4ff325dc77a845883abf1919e4'" Apr 22 14:35:45.122230 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:35:45.122211 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k_kserve-ci-e2e-test(f8a7d276-9a15-40a2-975e-bd7d4797e2ae)\"" pod="kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k" podUID="f8a7d276-9a15-40a2-975e-bd7d4797e2ae" Apr 22 14:35:45.583278 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.583202 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k"] Apr 22 14:35:45.694960 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.694929 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4"] Apr 22 14:35:45.695185 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.695174 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58d4b1dd-2bea-4a5c-a3bf-2d271dae167a" containerName="storage-initializer" Apr 22 14:35:45.695243 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.695187 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d4b1dd-2bea-4a5c-a3bf-2d271dae167a" containerName="storage-initializer" Apr 22 14:35:45.695243 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.695194 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f390d180-b8f4-4f2d-b76a-f9178bf4d3ff" containerName="storage-initializer" Apr 22 14:35:45.695243 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.695199 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f390d180-b8f4-4f2d-b76a-f9178bf4d3ff" containerName="storage-initializer" Apr 22 14:35:45.695243 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.695210 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f390d180-b8f4-4f2d-b76a-f9178bf4d3ff" containerName="kserve-container" Apr 22 14:35:45.695243 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.695215 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f390d180-b8f4-4f2d-b76a-f9178bf4d3ff" containerName="kserve-container" Apr 22 14:35:45.695243 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.695221 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58d4b1dd-2bea-4a5c-a3bf-2d271dae167a" containerName="storage-initializer" Apr 22 14:35:45.695243 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.695225 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d4b1dd-2bea-4a5c-a3bf-2d271dae167a" containerName="storage-initializer" Apr 22 14:35:45.695472 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.695263 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="58d4b1dd-2bea-4a5c-a3bf-2d271dae167a" containerName="storage-initializer" Apr 22 14:35:45.695472 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.695272 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="f390d180-b8f4-4f2d-b76a-f9178bf4d3ff" containerName="kserve-container" Apr 22 14:35:45.695472 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.695278 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="58d4b1dd-2bea-4a5c-a3bf-2d271dae167a" containerName="storage-initializer" Apr 22 14:35:45.699194 ip-10-0-129-161 
kubenswrapper[2566]: I0422 14:35:45.699173 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" Apr 22 14:35:45.701800 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.701782 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-x29k5\"" Apr 22 14:35:45.714536 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.714515 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4"] Apr 22 14:35:45.844678 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.844601 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2be530f1-30b6-4733-8d04-01a05b638b74-kserve-provision-location\") pod \"raw-sklearn-77009-predictor-96b4df8df-bw2w4\" (UID: \"2be530f1-30b6-4733-8d04-01a05b638b74\") " pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" Apr 22 14:35:45.945979 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.945943 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2be530f1-30b6-4733-8d04-01a05b638b74-kserve-provision-location\") pod \"raw-sklearn-77009-predictor-96b4df8df-bw2w4\" (UID: \"2be530f1-30b6-4733-8d04-01a05b638b74\") " pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" Apr 22 14:35:45.946333 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:45.946309 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2be530f1-30b6-4733-8d04-01a05b638b74-kserve-provision-location\") pod \"raw-sklearn-77009-predictor-96b4df8df-bw2w4\" (UID: \"2be530f1-30b6-4733-8d04-01a05b638b74\") " pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" Apr 22 14:35:46.009121 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:46.009090 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" Apr 22 14:35:46.116060 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:46.116037 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k_f8a7d276-9a15-40a2-975e-bd7d4797e2ae/storage-initializer/1.log" Apr 22 14:35:46.122215 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:46.122191 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4"] Apr 22 14:35:46.124192 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:35:46.124168 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2be530f1_30b6_4733_8d04_01a05b638b74.slice/crio-01dc6401c26d110a6e8c4fef2e7a9aac5360b3807b682b04fe8790bd89dfdbe0 WatchSource:0}: Error finding container 01dc6401c26d110a6e8c4fef2e7a9aac5360b3807b682b04fe8790bd89dfdbe0: Status 404 returned error can't find the container with id 01dc6401c26d110a6e8c4fef2e7a9aac5360b3807b682b04fe8790bd89dfdbe0 Apr 22 14:35:46.248426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:46.248400 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k_f8a7d276-9a15-40a2-975e-bd7d4797e2ae/storage-initializer/1.log" Apr 22 14:35:46.248574 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:46.248539 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k" Apr 22 14:35:46.349093 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:46.349060 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f8a7d276-9a15-40a2-975e-bd7d4797e2ae-cabundle-cert\") pod \"f8a7d276-9a15-40a2-975e-bd7d4797e2ae\" (UID: \"f8a7d276-9a15-40a2-975e-bd7d4797e2ae\") " Apr 22 14:35:46.349271 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:46.349107 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8a7d276-9a15-40a2-975e-bd7d4797e2ae-kserve-provision-location\") pod \"f8a7d276-9a15-40a2-975e-bd7d4797e2ae\" (UID: \"f8a7d276-9a15-40a2-975e-bd7d4797e2ae\") " Apr 22 14:35:46.349395 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:46.349371 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8a7d276-9a15-40a2-975e-bd7d4797e2ae-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f8a7d276-9a15-40a2-975e-bd7d4797e2ae" (UID: "f8a7d276-9a15-40a2-975e-bd7d4797e2ae"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:35:46.349454 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:46.349377 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8a7d276-9a15-40a2-975e-bd7d4797e2ae-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "f8a7d276-9a15-40a2-975e-bd7d4797e2ae" (UID: "f8a7d276-9a15-40a2-975e-bd7d4797e2ae"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:35:46.450636 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:46.450556 2566 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f8a7d276-9a15-40a2-975e-bd7d4797e2ae-cabundle-cert\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\"" Apr 22 14:35:46.450636 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:46.450587 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8a7d276-9a15-40a2-975e-bd7d4797e2ae-kserve-provision-location\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\"" Apr 22 14:35:47.120187 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:47.120143 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" event={"ID":"2be530f1-30b6-4733-8d04-01a05b638b74","Type":"ContainerStarted","Data":"582f95d33014470036add10026c9f5f8b6db629ebff85175e3217092c2cfccf9"} Apr 22 14:35:47.120187 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:47.120188 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" event={"ID":"2be530f1-30b6-4733-8d04-01a05b638b74","Type":"ContainerStarted","Data":"01dc6401c26d110a6e8c4fef2e7a9aac5360b3807b682b04fe8790bd89dfdbe0"} Apr 22 14:35:47.121264 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:47.121246 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k_f8a7d276-9a15-40a2-975e-bd7d4797e2ae/storage-initializer/1.log" Apr 22 14:35:47.121368 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:47.121323 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k" event={"ID":"f8a7d276-9a15-40a2-975e-bd7d4797e2ae","Type":"ContainerDied","Data":"5b25aa8ff60f36535ccd075ccf8170bb0822716894b7bfd47534bfc7c38cfb41"} Apr 22 14:35:47.121368 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:47.121340 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k" Apr 22 14:35:47.121368 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:47.121356 2566 scope.go:117] "RemoveContainer" containerID="ecb51073207dc448c138a7e5b713280b5aa77098fb56b7d15fed4d9f40bca072" Apr 22 14:35:47.169828 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:47.169795 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k"] Apr 22 14:35:47.177095 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:47.177063 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b37114-predictor-58fbffd4f5-x6r5k"] Apr 22 14:35:48.632926 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:48.632891 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8a7d276-9a15-40a2-975e-bd7d4797e2ae" path="/var/lib/kubelet/pods/f8a7d276-9a15-40a2-975e-bd7d4797e2ae/volumes" Apr 22 14:35:50.131302 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:50.131267 2566 generic.go:358] "Generic (PLEG): container finished" podID="2be530f1-30b6-4733-8d04-01a05b638b74" containerID="582f95d33014470036add10026c9f5f8b6db629ebff85175e3217092c2cfccf9" exitCode=0 Apr 22 14:35:50.131679 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:50.131307 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" event={"ID":"2be530f1-30b6-4733-8d04-01a05b638b74","Type":"ContainerDied","Data":"582f95d33014470036add10026c9f5f8b6db629ebff85175e3217092c2cfccf9"} Apr 22 14:35:51.135553 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:51.135514 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" event={"ID":"2be530f1-30b6-4733-8d04-01a05b638b74","Type":"ContainerStarted","Data":"04cf969bbf6df1f107607fcba9ab6fec1fbe9f23ec98a3180c32088e025e866b"} Apr 22 14:35:51.135955 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:51.135842 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" Apr 22 14:35:51.137176 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:51.137147 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" podUID="2be530f1-30b6-4733-8d04-01a05b638b74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 22 14:35:51.151067 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:51.151024 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" podStartSLOduration=6.151012332 podStartE2EDuration="6.151012332s" podCreationTimestamp="2026-04-22 14:35:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:35:51.150273362 +0000 UTC m=+1233.129784740" watchObservedRunningTime="2026-04-22 14:35:51.151012332 +0000 UTC m=+1233.130523709" Apr 22 14:35:52.139049 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:35:52.139012 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" podUID="2be530f1-30b6-4733-8d04-01a05b638b74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: 
connection refused" Apr 22 14:36:02.139547 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:36:02.139504 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" podUID="2be530f1-30b6-4733-8d04-01a05b638b74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 22 14:36:12.141178 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:36:12.141136 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" podUID="2be530f1-30b6-4733-8d04-01a05b638b74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 22 14:36:22.139171 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:36:22.139124 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" podUID="2be530f1-30b6-4733-8d04-01a05b638b74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 22 14:36:32.139392 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:36:32.139352 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" podUID="2be530f1-30b6-4733-8d04-01a05b638b74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 22 14:36:42.139165 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:36:42.139122 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" podUID="2be530f1-30b6-4733-8d04-01a05b638b74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 22 14:36:52.139603 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:36:52.139557 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" podUID="2be530f1-30b6-4733-8d04-01a05b638b74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 22 14:37:02.140640 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:02.140606 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" Apr 22 14:37:05.815053 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:05.815019 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4"] Apr 22 14:37:05.815459 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:05.815264 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" podUID="2be530f1-30b6-4733-8d04-01a05b638b74" containerName="kserve-container" containerID="cri-o://04cf969bbf6df1f107607fcba9ab6fec1fbe9f23ec98a3180c32088e025e866b" gracePeriod=30 Apr 22 14:37:05.934873 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:05.934840 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2"] Apr 22 14:37:05.935120 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:05.935108 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8a7d276-9a15-40a2-975e-bd7d4797e2ae" containerName="storage-initializer" Apr 
22 14:37:05.935168 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:05.935122 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a7d276-9a15-40a2-975e-bd7d4797e2ae" containerName="storage-initializer" Apr 22 14:37:05.935208 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:05.935174 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8a7d276-9a15-40a2-975e-bd7d4797e2ae" containerName="storage-initializer" Apr 22 14:37:05.935208 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:05.935182 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8a7d276-9a15-40a2-975e-bd7d4797e2ae" containerName="storage-initializer" Apr 22 14:37:05.935273 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:05.935217 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8a7d276-9a15-40a2-975e-bd7d4797e2ae" containerName="storage-initializer" Apr 22 14:37:05.935273 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:05.935223 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a7d276-9a15-40a2-975e-bd7d4797e2ae" containerName="storage-initializer" Apr 22 14:37:05.938015 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:05.937996 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" Apr 22 14:37:05.946698 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:05.946675 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2"] Apr 22 14:37:05.992985 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:05.992952 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dba6cfee-31be-417c-89b2-ceb249d897d1-kserve-provision-location\") pod \"raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2\" (UID: \"dba6cfee-31be-417c-89b2-ceb249d897d1\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" Apr 22 14:37:06.093498 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:06.093371 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dba6cfee-31be-417c-89b2-ceb249d897d1-kserve-provision-location\") pod \"raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2\" (UID: \"dba6cfee-31be-417c-89b2-ceb249d897d1\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" Apr 22 14:37:06.093783 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:06.093761 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dba6cfee-31be-417c-89b2-ceb249d897d1-kserve-provision-location\") pod \"raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2\" (UID: \"dba6cfee-31be-417c-89b2-ceb249d897d1\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" Apr 22 14:37:06.250565 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:06.250529 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" Apr 22 14:37:06.366819 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:06.366691 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2"] Apr 22 14:37:06.369426 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:37:06.369400 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddba6cfee_31be_417c_89b2_ceb249d897d1.slice/crio-0495f5567c7cdd942cdfd60acb1dd00c1744b9b60b57631975fbbfe42aea79b6 WatchSource:0}: Error finding container 0495f5567c7cdd942cdfd60acb1dd00c1744b9b60b57631975fbbfe42aea79b6: Status 404 returned error can't find the container with id 0495f5567c7cdd942cdfd60acb1dd00c1744b9b60b57631975fbbfe42aea79b6 Apr 22 14:37:07.342218 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:07.342178 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" event={"ID":"dba6cfee-31be-417c-89b2-ceb249d897d1","Type":"ContainerStarted","Data":"58797c89a1596d4883d061dafcec4caa7cc4060aa55ab49f46a49231297448ed"} Apr 22 14:37:07.342654 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:07.342224 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" event={"ID":"dba6cfee-31be-417c-89b2-ceb249d897d1","Type":"ContainerStarted","Data":"0495f5567c7cdd942cdfd60acb1dd00c1744b9b60b57631975fbbfe42aea79b6"} Apr 22 14:37:10.304421 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:10.304356 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" Apr 22 14:37:10.351826 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:10.351792 2566 generic.go:358] "Generic (PLEG): container finished" podID="dba6cfee-31be-417c-89b2-ceb249d897d1" containerID="58797c89a1596d4883d061dafcec4caa7cc4060aa55ab49f46a49231297448ed" exitCode=0 Apr 22 14:37:10.352009 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:10.351870 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" event={"ID":"dba6cfee-31be-417c-89b2-ceb249d897d1","Type":"ContainerDied","Data":"58797c89a1596d4883d061dafcec4caa7cc4060aa55ab49f46a49231297448ed"} Apr 22 14:37:10.353247 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:10.353227 2566 generic.go:358] "Generic (PLEG): container finished" podID="2be530f1-30b6-4733-8d04-01a05b638b74" containerID="04cf969bbf6df1f107607fcba9ab6fec1fbe9f23ec98a3180c32088e025e866b" exitCode=0 Apr 22 14:37:10.353345 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:10.353250 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" event={"ID":"2be530f1-30b6-4733-8d04-01a05b638b74","Type":"ContainerDied","Data":"04cf969bbf6df1f107607fcba9ab6fec1fbe9f23ec98a3180c32088e025e866b"} Apr 22 14:37:10.353345 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:10.353291 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" event={"ID":"2be530f1-30b6-4733-8d04-01a05b638b74","Type":"ContainerDied","Data":"01dc6401c26d110a6e8c4fef2e7a9aac5360b3807b682b04fe8790bd89dfdbe0"} Apr 22 14:37:10.353345 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:10.353301 2566 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4" Apr 22 14:37:10.353345 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:10.353310 2566 scope.go:117] "RemoveContainer" containerID="04cf969bbf6df1f107607fcba9ab6fec1fbe9f23ec98a3180c32088e025e866b" Apr 22 14:37:10.360342 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:10.360327 2566 scope.go:117] "RemoveContainer" containerID="582f95d33014470036add10026c9f5f8b6db629ebff85175e3217092c2cfccf9" Apr 22 14:37:10.366900 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:10.366886 2566 scope.go:117] "RemoveContainer" containerID="04cf969bbf6df1f107607fcba9ab6fec1fbe9f23ec98a3180c32088e025e866b" Apr 22 14:37:10.367127 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:37:10.367108 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04cf969bbf6df1f107607fcba9ab6fec1fbe9f23ec98a3180c32088e025e866b\": container with ID starting with 04cf969bbf6df1f107607fcba9ab6fec1fbe9f23ec98a3180c32088e025e866b not found: ID does not exist" containerID="04cf969bbf6df1f107607fcba9ab6fec1fbe9f23ec98a3180c32088e025e866b" Apr 22 14:37:10.367216 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:10.367132 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04cf969bbf6df1f107607fcba9ab6fec1fbe9f23ec98a3180c32088e025e866b"} err="failed to get container status \"04cf969bbf6df1f107607fcba9ab6fec1fbe9f23ec98a3180c32088e025e866b\": rpc error: code = NotFound desc = could not find container \"04cf969bbf6df1f107607fcba9ab6fec1fbe9f23ec98a3180c32088e025e866b\": container with ID starting with 04cf969bbf6df1f107607fcba9ab6fec1fbe9f23ec98a3180c32088e025e866b not found: ID does not exist" Apr 22 14:37:10.367216 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:10.367148 2566 scope.go:117] "RemoveContainer" containerID="582f95d33014470036add10026c9f5f8b6db629ebff85175e3217092c2cfccf9" Apr 22 14:37:10.367362 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:37:10.367344 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"582f95d33014470036add10026c9f5f8b6db629ebff85175e3217092c2cfccf9\": container with ID starting with 582f95d33014470036add10026c9f5f8b6db629ebff85175e3217092c2cfccf9 not found: ID does not exist" containerID="582f95d33014470036add10026c9f5f8b6db629ebff85175e3217092c2cfccf9" Apr 22 14:37:10.367398 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:10.367367 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"582f95d33014470036add10026c9f5f8b6db629ebff85175e3217092c2cfccf9"} err="failed to get container status \"582f95d33014470036add10026c9f5f8b6db629ebff85175e3217092c2cfccf9\": rpc error: code = NotFound desc = could not find container \"582f95d33014470036add10026c9f5f8b6db629ebff85175e3217092c2cfccf9\": container with ID starting with 582f95d33014470036add10026c9f5f8b6db629ebff85175e3217092c2cfccf9 not found: ID does not exist" Apr 22 14:37:10.427142 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:10.427121 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2be530f1-30b6-4733-8d04-01a05b638b74-kserve-provision-location\") pod \"2be530f1-30b6-4733-8d04-01a05b638b74\" (UID: \"2be530f1-30b6-4733-8d04-01a05b638b74\") " Apr 22 14:37:10.427421 
ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:10.427397 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2be530f1-30b6-4733-8d04-01a05b638b74-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2be530f1-30b6-4733-8d04-01a05b638b74" (UID: "2be530f1-30b6-4733-8d04-01a05b638b74"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:37:10.528265 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:10.528232 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2be530f1-30b6-4733-8d04-01a05b638b74-kserve-provision-location\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\"" Apr 22 14:37:10.668395 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:10.668351 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4"] Apr 22 14:37:10.672391 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:10.672369 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-77009-predictor-96b4df8df-bw2w4"] Apr 22 14:37:11.357143 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:11.357113 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" event={"ID":"dba6cfee-31be-417c-89b2-ceb249d897d1","Type":"ContainerStarted","Data":"5e03194ce4e58aaf5bbabca0d147466edb1b6923191ca08b5f469e6a18b6dc29"} Apr 22 14:37:11.357564 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:11.357424 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" Apr 22 14:37:11.358618 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:11.358594 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" podUID="dba6cfee-31be-417c-89b2-ceb249d897d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 22 14:37:11.374218 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:11.374158 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" podStartSLOduration=6.37414105 podStartE2EDuration="6.37414105s" podCreationTimestamp="2026-04-22 14:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:37:11.373836915 +0000 UTC m=+1313.353348284" watchObservedRunningTime="2026-04-22 14:37:11.37414105 +0000 UTC m=+1313.353652421" Apr 22 14:37:12.362158 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:12.362123 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" podUID="dba6cfee-31be-417c-89b2-ceb249d897d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 22 14:37:12.634532 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:12.634454 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2be530f1-30b6-4733-8d04-01a05b638b74" path="/var/lib/kubelet/pods/2be530f1-30b6-4733-8d04-01a05b638b74/volumes" Apr 22 14:37:22.362792 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:22.362749 
2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" podUID="dba6cfee-31be-417c-89b2-ceb249d897d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 22 14:37:32.362698 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:32.362657 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" podUID="dba6cfee-31be-417c-89b2-ceb249d897d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 22 14:37:42.362421 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:42.362331 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" podUID="dba6cfee-31be-417c-89b2-ceb249d897d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 22 14:37:52.362275 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:37:52.362223 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" podUID="dba6cfee-31be-417c-89b2-ceb249d897d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 22 14:38:02.362628 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:02.362576 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" podUID="dba6cfee-31be-417c-89b2-ceb249d897d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 22 14:38:12.362981 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:12.362939 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" podUID="dba6cfee-31be-417c-89b2-ceb249d897d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 22 14:38:22.363594 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:22.363558 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" Apr 22 14:38:25.978815 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:25.978783 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2"] Apr 22 14:38:25.979199 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:25.979037 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" podUID="dba6cfee-31be-417c-89b2-ceb249d897d1" containerName="kserve-container" containerID="cri-o://5e03194ce4e58aaf5bbabca0d147466edb1b6923191ca08b5f469e6a18b6dc29" gracePeriod=30 Apr 22 14:38:30.210830 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:30.210806 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" Apr 22 14:38:30.292039 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:30.291514 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dba6cfee-31be-417c-89b2-ceb249d897d1-kserve-provision-location\") pod \"dba6cfee-31be-417c-89b2-ceb249d897d1\" (UID: \"dba6cfee-31be-417c-89b2-ceb249d897d1\") " Apr 22 14:38:30.293518 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:30.292406 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dba6cfee-31be-417c-89b2-ceb249d897d1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dba6cfee-31be-417c-89b2-ceb249d897d1" (UID: "dba6cfee-31be-417c-89b2-ceb249d897d1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:38:30.392948 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:30.392908 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dba6cfee-31be-417c-89b2-ceb249d897d1-kserve-provision-location\") on node \"ip-10-0-129-161.ec2.internal\" DevicePath \"\"" Apr 22 14:38:30.568218 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:30.568133 2566 generic.go:358] "Generic (PLEG): container finished" podID="dba6cfee-31be-417c-89b2-ceb249d897d1" containerID="5e03194ce4e58aaf5bbabca0d147466edb1b6923191ca08b5f469e6a18b6dc29" exitCode=0 Apr 22 14:38:30.568218 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:30.568192 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" event={"ID":"dba6cfee-31be-417c-89b2-ceb249d897d1","Type":"ContainerDied","Data":"5e03194ce4e58aaf5bbabca0d147466edb1b6923191ca08b5f469e6a18b6dc29"} Apr 22 14:38:30.568506 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:30.568219 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" Apr 22 14:38:30.568506 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:30.568238 2566 scope.go:117] "RemoveContainer" containerID="5e03194ce4e58aaf5bbabca0d147466edb1b6923191ca08b5f469e6a18b6dc29" Apr 22 14:38:30.568506 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:30.568228 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2" event={"ID":"dba6cfee-31be-417c-89b2-ceb249d897d1","Type":"ContainerDied","Data":"0495f5567c7cdd942cdfd60acb1dd00c1744b9b60b57631975fbbfe42aea79b6"} Apr 22 14:38:30.575720 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:30.575701 2566 scope.go:117] "RemoveContainer" containerID="58797c89a1596d4883d061dafcec4caa7cc4060aa55ab49f46a49231297448ed" Apr 22 14:38:30.585240 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:30.585215 2566 scope.go:117] "RemoveContainer" containerID="5e03194ce4e58aaf5bbabca0d147466edb1b6923191ca08b5f469e6a18b6dc29" Apr 22 14:38:30.585547 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:38:30.585528 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e03194ce4e58aaf5bbabca0d147466edb1b6923191ca08b5f469e6a18b6dc29\": container with ID starting with 5e03194ce4e58aaf5bbabca0d147466edb1b6923191ca08b5f469e6a18b6dc29 not found: ID does not exist" containerID="5e03194ce4e58aaf5bbabca0d147466edb1b6923191ca08b5f469e6a18b6dc29" Apr 22 14:38:30.585626 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:30.585576 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e03194ce4e58aaf5bbabca0d147466edb1b6923191ca08b5f469e6a18b6dc29"} err="failed to get container status \"5e03194ce4e58aaf5bbabca0d147466edb1b6923191ca08b5f469e6a18b6dc29\": rpc error: code = NotFound desc = could not find container \"5e03194ce4e58aaf5bbabca0d147466edb1b6923191ca08b5f469e6a18b6dc29\": container with ID starting with 5e03194ce4e58aaf5bbabca0d147466edb1b6923191ca08b5f469e6a18b6dc29 not found: ID does not exist" Apr 22 14:38:30.585626 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:30.585594 2566 scope.go:117] "RemoveContainer" containerID="58797c89a1596d4883d061dafcec4caa7cc4060aa55ab49f46a49231297448ed" Apr 22 14:38:30.585819 ip-10-0-129-161 kubenswrapper[2566]: E0422 14:38:30.585800 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58797c89a1596d4883d061dafcec4caa7cc4060aa55ab49f46a49231297448ed\": container with ID starting with 58797c89a1596d4883d061dafcec4caa7cc4060aa55ab49f46a49231297448ed not found: ID does not exist" containerID="58797c89a1596d4883d061dafcec4caa7cc4060aa55ab49f46a49231297448ed" Apr 22 14:38:30.585860 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:30.585825 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58797c89a1596d4883d061dafcec4caa7cc4060aa55ab49f46a49231297448ed"} err="failed to get container status \"58797c89a1596d4883d061dafcec4caa7cc4060aa55ab49f46a49231297448ed\": rpc error: code = NotFound desc = could not find container \"58797c89a1596d4883d061dafcec4caa7cc4060aa55ab49f46a49231297448ed\": container with ID starting with 58797c89a1596d4883d061dafcec4caa7cc4060aa55ab49f46a49231297448ed not found: ID does not exist" Apr 22 14:38:30.588234 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:30.588211 2566 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2"] Apr 22 14:38:30.592187 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:30.592166 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-9da2e-predictor-75cc5f46c-lg5g2"] Apr 22 14:38:30.633680 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:30.633644 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dba6cfee-31be-417c-89b2-ceb249d897d1" path="/var/lib/kubelet/pods/dba6cfee-31be-417c-89b2-ceb249d897d1/volumes" Apr 22 14:38:50.515752 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.515704 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bgl78/must-gather-vsvvj"] Apr 22 14:38:50.516238 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.515937 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2be530f1-30b6-4733-8d04-01a05b638b74" containerName="kserve-container" Apr 22 14:38:50.516238 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.515947 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be530f1-30b6-4733-8d04-01a05b638b74" containerName="kserve-container" Apr 22 14:38:50.516238 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.515956 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dba6cfee-31be-417c-89b2-ceb249d897d1" containerName="kserve-container" Apr 22 14:38:50.516238 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.515962 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="dba6cfee-31be-417c-89b2-ceb249d897d1" containerName="kserve-container" Apr 22 14:38:50.516238 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.515976 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dba6cfee-31be-417c-89b2-ceb249d897d1" containerName="storage-initializer" Apr 22 14:38:50.516238 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.515982 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="dba6cfee-31be-417c-89b2-ceb249d897d1" containerName="storage-initializer" Apr 22 14:38:50.516238 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.515992 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2be530f1-30b6-4733-8d04-01a05b638b74" containerName="storage-initializer" Apr 22 14:38:50.516238 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.515997 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be530f1-30b6-4733-8d04-01a05b638b74" containerName="storage-initializer" Apr 22 14:38:50.516238 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.516034 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="dba6cfee-31be-417c-89b2-ceb249d897d1" containerName="kserve-container" Apr 22 14:38:50.516238 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.516043 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="2be530f1-30b6-4733-8d04-01a05b638b74" containerName="kserve-container" Apr 22 14:38:50.518772 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.518756 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bgl78/must-gather-vsvvj" Apr 22 14:38:50.521270 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.521248 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-bgl78\"/\"openshift-service-ca.crt\"" Apr 22 14:38:50.522353 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.522336 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-bgl78\"/\"kube-root-ca.crt\"" Apr 22 14:38:50.522412 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.522339 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-bgl78\"/\"default-dockercfg-qvmhq\"" Apr 22 14:38:50.525723 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.525701 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bgl78/must-gather-vsvvj"] Apr 22 14:38:50.636856 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.636829 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d597be39-72e3-4be6-b5ec-dcc322fef09f-must-gather-output\") pod \"must-gather-vsvvj\" (UID: \"d597be39-72e3-4be6-b5ec-dcc322fef09f\") " pod="openshift-must-gather-bgl78/must-gather-vsvvj" Apr 22 14:38:50.637021 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.636872 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x84jw\" (UniqueName: \"kubernetes.io/projected/d597be39-72e3-4be6-b5ec-dcc322fef09f-kube-api-access-x84jw\") pod \"must-gather-vsvvj\" (UID: \"d597be39-72e3-4be6-b5ec-dcc322fef09f\") " pod="openshift-must-gather-bgl78/must-gather-vsvvj" Apr 22 14:38:50.737678 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.737650 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x84jw\" (UniqueName: \"kubernetes.io/projected/d597be39-72e3-4be6-b5ec-dcc322fef09f-kube-api-access-x84jw\") pod \"must-gather-vsvvj\" (UID: \"d597be39-72e3-4be6-b5ec-dcc322fef09f\") " pod="openshift-must-gather-bgl78/must-gather-vsvvj" Apr 22 14:38:50.737847 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.737700 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d597be39-72e3-4be6-b5ec-dcc322fef09f-must-gather-output\") pod \"must-gather-vsvvj\" (UID: \"d597be39-72e3-4be6-b5ec-dcc322fef09f\") " pod="openshift-must-gather-bgl78/must-gather-vsvvj" Apr 22 14:38:50.737985 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.737970 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d597be39-72e3-4be6-b5ec-dcc322fef09f-must-gather-output\") pod \"must-gather-vsvvj\" (UID: \"d597be39-72e3-4be6-b5ec-dcc322fef09f\") " pod="openshift-must-gather-bgl78/must-gather-vsvvj" Apr 22 14:38:50.745567 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.745537 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x84jw\" (UniqueName: \"kubernetes.io/projected/d597be39-72e3-4be6-b5ec-dcc322fef09f-kube-api-access-x84jw\") pod \"must-gather-vsvvj\" (UID: \"d597be39-72e3-4be6-b5ec-dcc322fef09f\") " pod="openshift-must-gather-bgl78/must-gather-vsvvj" Apr 22 14:38:50.828237 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.828155 2566 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-bgl78/must-gather-vsvvj" Apr 22 14:38:50.940868 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:50.940846 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bgl78/must-gather-vsvvj"] Apr 22 14:38:50.943239 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:38:50.943216 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd597be39_72e3_4be6_b5ec_dcc322fef09f.slice/crio-48f797a066d945d8a57913a76e622c934827c9f7aa19e523157c63ef42acc1ef WatchSource:0}: Error finding container 48f797a066d945d8a57913a76e622c934827c9f7aa19e523157c63ef42acc1ef: Status 404 returned error can't find the container with id 48f797a066d945d8a57913a76e622c934827c9f7aa19e523157c63ef42acc1ef Apr 22 14:38:51.626008 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:51.625972 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgl78/must-gather-vsvvj" event={"ID":"d597be39-72e3-4be6-b5ec-dcc322fef09f","Type":"ContainerStarted","Data":"48f797a066d945d8a57913a76e622c934827c9f7aa19e523157c63ef42acc1ef"} Apr 22 14:38:52.632223 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:52.632193 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgl78/must-gather-vsvvj" event={"ID":"d597be39-72e3-4be6-b5ec-dcc322fef09f","Type":"ContainerStarted","Data":"a408e3381c2eee889de34e640d6663c1e5d77f5ae5ba6cc89f7720872cd543c4"} Apr 22 14:38:52.632223 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:52.632227 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgl78/must-gather-vsvvj" event={"ID":"d597be39-72e3-4be6-b5ec-dcc322fef09f","Type":"ContainerStarted","Data":"bd1ca0e19c1d8d58d20f6603072f22ecc8acbd6e86274688138e2a8128e60268"} Apr 22 14:38:52.645259 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:52.645211 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bgl78/must-gather-vsvvj" podStartSLOduration=1.668884088 podStartE2EDuration="2.645196402s" podCreationTimestamp="2026-04-22 14:38:50 +0000 UTC" firstStartedPulling="2026-04-22 14:38:50.944979963 +0000 UTC m=+1412.924491320" lastFinishedPulling="2026-04-22 14:38:51.921292274 +0000 UTC m=+1413.900803634" observedRunningTime="2026-04-22 14:38:52.644620843 +0000 UTC m=+1414.624132220" watchObservedRunningTime="2026-04-22 14:38:52.645196402 +0000 UTC m=+1414.624707781" Apr 22 14:38:53.269409 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:53.269374 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-dghbg_a3986444-d7dd-4409-b107-157fc81b5e02/global-pull-secret-syncer/0.log" Apr 22 14:38:53.367515 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:53.367483 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-rnwmp_6a11fb65-b996-42de-a115-49420effa19b/konnectivity-agent/0.log" Apr 22 14:38:53.441724 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:53.441693 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-161.ec2.internal_2a7add03423232b9ddbd2ad4e8a3d9c3/haproxy/0.log" Apr 22 14:38:57.309309 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:57.309280 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4sg67_7a7698ea-e37f-4471-b0da-79820ee5c2ef/node-exporter/0.log" Apr 22 14:38:57.326673 
ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:57.326648 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4sg67_7a7698ea-e37f-4471-b0da-79820ee5c2ef/kube-rbac-proxy/0.log" Apr 22 14:38:57.345113 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:38:57.345088 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4sg67_7a7698ea-e37f-4471-b0da-79820ee5c2ef/init-textfile/0.log" Apr 22 14:39:00.390385 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.390354 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7"] Apr 22 14:39:00.393254 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.393237 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" Apr 22 14:39:00.403510 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.403483 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7"] Apr 22 14:39:00.521226 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.521182 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7cd3fb9b-3659-4a3f-9f6c-30b5e84af015-lib-modules\") pod \"perf-node-gather-daemonset-82mb7\" (UID: \"7cd3fb9b-3659-4a3f-9f6c-30b5e84af015\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" Apr 22 14:39:00.521385 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.521239 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7cd3fb9b-3659-4a3f-9f6c-30b5e84af015-podres\") pod \"perf-node-gather-daemonset-82mb7\" (UID: \"7cd3fb9b-3659-4a3f-9f6c-30b5e84af015\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" Apr 22 14:39:00.521385 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.521294 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7cd3fb9b-3659-4a3f-9f6c-30b5e84af015-proc\") pod \"perf-node-gather-daemonset-82mb7\" (UID: \"7cd3fb9b-3659-4a3f-9f6c-30b5e84af015\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" Apr 22 14:39:00.521385 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.521313 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2q6b\" (UniqueName: \"kubernetes.io/projected/7cd3fb9b-3659-4a3f-9f6c-30b5e84af015-kube-api-access-b2q6b\") pod \"perf-node-gather-daemonset-82mb7\" (UID: \"7cd3fb9b-3659-4a3f-9f6c-30b5e84af015\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" Apr 22 14:39:00.521385 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.521343 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7cd3fb9b-3659-4a3f-9f6c-30b5e84af015-sys\") pod \"perf-node-gather-daemonset-82mb7\" (UID: \"7cd3fb9b-3659-4a3f-9f6c-30b5e84af015\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" Apr 22 14:39:00.622482 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.622448 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/7cd3fb9b-3659-4a3f-9f6c-30b5e84af015-lib-modules\") pod \"perf-node-gather-daemonset-82mb7\" (UID: \"7cd3fb9b-3659-4a3f-9f6c-30b5e84af015\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" Apr 22 14:39:00.622654 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.622492 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7cd3fb9b-3659-4a3f-9f6c-30b5e84af015-podres\") pod \"perf-node-gather-daemonset-82mb7\" (UID: \"7cd3fb9b-3659-4a3f-9f6c-30b5e84af015\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" Apr 22 14:39:00.622654 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.622519 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7cd3fb9b-3659-4a3f-9f6c-30b5e84af015-proc\") pod \"perf-node-gather-daemonset-82mb7\" (UID: \"7cd3fb9b-3659-4a3f-9f6c-30b5e84af015\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" Apr 22 14:39:00.622654 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.622534 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2q6b\" (UniqueName: \"kubernetes.io/projected/7cd3fb9b-3659-4a3f-9f6c-30b5e84af015-kube-api-access-b2q6b\") pod \"perf-node-gather-daemonset-82mb7\" (UID: \"7cd3fb9b-3659-4a3f-9f6c-30b5e84af015\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" Apr 22 14:39:00.622654 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.622567 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7cd3fb9b-3659-4a3f-9f6c-30b5e84af015-sys\") pod \"perf-node-gather-daemonset-82mb7\" (UID: \"7cd3fb9b-3659-4a3f-9f6c-30b5e84af015\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" Apr 22 14:39:00.622654 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.622576 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7cd3fb9b-3659-4a3f-9f6c-30b5e84af015-lib-modules\") pod \"perf-node-gather-daemonset-82mb7\" (UID: \"7cd3fb9b-3659-4a3f-9f6c-30b5e84af015\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" Apr 22 14:39:00.622654 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.622629 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7cd3fb9b-3659-4a3f-9f6c-30b5e84af015-proc\") pod \"perf-node-gather-daemonset-82mb7\" (UID: \"7cd3fb9b-3659-4a3f-9f6c-30b5e84af015\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" Apr 22 14:39:00.622654 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.622642 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7cd3fb9b-3659-4a3f-9f6c-30b5e84af015-podres\") pod \"perf-node-gather-daemonset-82mb7\" (UID: \"7cd3fb9b-3659-4a3f-9f6c-30b5e84af015\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" Apr 22 14:39:00.623025 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.622644 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7cd3fb9b-3659-4a3f-9f6c-30b5e84af015-sys\") pod \"perf-node-gather-daemonset-82mb7\" (UID: \"7cd3fb9b-3659-4a3f-9f6c-30b5e84af015\") " 
pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" Apr 22 14:39:00.633345 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.633321 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2q6b\" (UniqueName: \"kubernetes.io/projected/7cd3fb9b-3659-4a3f-9f6c-30b5e84af015-kube-api-access-b2q6b\") pod \"perf-node-gather-daemonset-82mb7\" (UID: \"7cd3fb9b-3659-4a3f-9f6c-30b5e84af015\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" Apr 22 14:39:00.704203 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.704123 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" Apr 22 14:39:00.841183 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.841159 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7"] Apr 22 14:39:00.845142 ip-10-0-129-161 kubenswrapper[2566]: W0422 14:39:00.845108 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7cd3fb9b_3659_4a3f_9f6c_30b5e84af015.slice/crio-18a026586a7f84d8667bab44dd5001378bd089cf1fa11913114f3857f0fe6dab WatchSource:0}: Error finding container 18a026586a7f84d8667bab44dd5001378bd089cf1fa11913114f3857f0fe6dab: Status 404 returned error can't find the container with id 18a026586a7f84d8667bab44dd5001378bd089cf1fa11913114f3857f0fe6dab Apr 22 14:39:00.846523 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.846505 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:39:00.904229 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.904208 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-22qt9_d568b082-5592-4166-8e56-4b3f5d03022f/dns/0.log" Apr 22 14:39:00.922471 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:00.922427 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-22qt9_d568b082-5592-4166-8e56-4b3f5d03022f/kube-rbac-proxy/0.log" Apr 22 14:39:01.028572 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:01.028492 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6rsgf_033bc69f-f51f-40ca-8484-1ae2dc580b53/dns-node-resolver/0.log" Apr 22 14:39:01.495720 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:01.495694 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zmgl9_1c429a6e-0682-4fe6-9ec0-b39e350ccc63/node-ca/0.log" Apr 22 14:39:01.663705 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:01.662914 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" event={"ID":"7cd3fb9b-3659-4a3f-9f6c-30b5e84af015","Type":"ContainerStarted","Data":"0c4035aeeb690803468d45b68a94aadf7541c2f7d68ce0dda994e37b2f387ccd"} Apr 22 14:39:01.663705 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:01.662952 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" event={"ID":"7cd3fb9b-3659-4a3f-9f6c-30b5e84af015","Type":"ContainerStarted","Data":"18a026586a7f84d8667bab44dd5001378bd089cf1fa11913114f3857f0fe6dab"} Apr 22 14:39:01.663705 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:01.663673 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" Apr 22 14:39:01.680290 
ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:01.680244 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" podStartSLOduration=1.680227058 podStartE2EDuration="1.680227058s" podCreationTimestamp="2026-04-22 14:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:39:01.678459562 +0000 UTC m=+1423.657970938" watchObservedRunningTime="2026-04-22 14:39:01.680227058 +0000 UTC m=+1423.659738437" Apr 22 14:39:02.433872 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:02.433842 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-f4zzk_3da3d9ed-4865-4d4d-a429-13417afb99df/serve-healthcheck-canary/0.log" Apr 22 14:39:02.861038 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:02.860995 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ctlbm_93431b3f-da56-4a76-93a2-98e25d7809a5/kube-rbac-proxy/0.log" Apr 22 14:39:02.876684 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:02.876654 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ctlbm_93431b3f-da56-4a76-93a2-98e25d7809a5/exporter/0.log" Apr 22 14:39:02.893619 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:02.893575 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ctlbm_93431b3f-da56-4a76-93a2-98e25d7809a5/extractor/0.log" Apr 22 14:39:04.950958 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:04.950930 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-ggqbz_fd025db6-f28c-4a4c-8279-22e1a33a50ca/server/0.log" Apr 22 14:39:05.094391 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:05.094348 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-k5vb4_01933426-0fec-43f9-bb14-c5367060ee88/manager/0.log" Apr 22 14:39:05.135485 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:05.135445 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-c8tlw_1ee7ffd2-470a-44c8-81a5-65ad3bea8e63/seaweedfs/0.log" Apr 22 14:39:08.679036 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:08.678966 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-82mb7" Apr 22 14:39:09.775679 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:09.775653 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ng46h_57899f5d-95c9-4f88-8a37-538507647859/kube-multus-additional-cni-plugins/0.log" Apr 22 14:39:09.792237 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:09.792205 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ng46h_57899f5d-95c9-4f88-8a37-538507647859/egress-router-binary-copy/0.log" Apr 22 14:39:09.812025 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:09.811991 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ng46h_57899f5d-95c9-4f88-8a37-538507647859/cni-plugins/0.log" Apr 22 14:39:09.830399 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:09.830368 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ng46h_57899f5d-95c9-4f88-8a37-538507647859/bond-cni-plugin/0.log" Apr 22 14:39:09.850835 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:09.850802 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ng46h_57899f5d-95c9-4f88-8a37-538507647859/routeoverride-cni/0.log" Apr 22 14:39:09.868810 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:09.868731 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ng46h_57899f5d-95c9-4f88-8a37-538507647859/whereabouts-cni-bincopy/0.log" Apr 22 14:39:09.886531 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:09.886500 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ng46h_57899f5d-95c9-4f88-8a37-538507647859/whereabouts-cni/0.log" Apr 22 14:39:10.270884 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:10.270851 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nwk64_1e1454ff-e291-42e9-8bb6-cd922139fd02/kube-multus/0.log" Apr 22 14:39:10.298529 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:10.298497 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9rgrl_ff0fda3b-a631-4479-bca1-451b3fd7ac2f/network-metrics-daemon/0.log" Apr 22 14:39:10.316752 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:10.316722 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9rgrl_ff0fda3b-a631-4479-bca1-451b3fd7ac2f/kube-rbac-proxy/0.log" Apr 22 14:39:11.307532 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:11.307185 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/ovn-controller/0.log" Apr 22 14:39:11.327497 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:11.327468 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/ovn-acl-logging/0.log" Apr 22 14:39:11.341115 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:11.341076 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/ovn-acl-logging/1.log" Apr 22 14:39:11.362488 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:11.362456 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/kube-rbac-proxy-node/0.log" Apr 22 14:39:11.382610 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:11.382587 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 14:39:11.398167 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:11.398130 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/northd/0.log" Apr 22 14:39:11.414404 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:11.414381 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/nbdb/0.log" Apr 22 14:39:11.430583 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:11.430558 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/sbdb/0.log" Apr 22 14:39:11.612426 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:11.612394 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-47psb_d37f6164-ab7b-4939-a74e-19ab726827bb/ovnkube-controller/0.log" Apr 22 14:39:13.167831 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:13.167794 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-w9djl_ab2a1f01-aab3-488d-8a5c-09e7a9568954/network-check-target-container/0.log" Apr 22 14:39:13.964989 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:13.964959 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-l8hc9_bccc745a-d0a3-4d47-bb03-7502b82f4a26/iptables-alerter/0.log" Apr 22 14:39:14.652311 ip-10-0-129-161 kubenswrapper[2566]: I0422 14:39:14.652278 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-4l7kw_c9e3f13d-f48a-4ce7-a59d-16c11e660545/tuned/0.log"