Apr 22 19:58:13.910423 ip-10-0-141-46 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:58:14.424981 ip-10-0-141-46 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:58:14.424981 ip-10-0-141-46 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:58:14.424981 ip-10-0-141-46 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:58:14.424981 ip-10-0-141-46 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:58:14.424981 ip-10-0-141-46 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:58:14.427864 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.427782 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:58:14.433926 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433910 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:58:14.433926 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433926 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:58:14.433990 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433930 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:58:14.433990 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433933 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:58:14.433990 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433936 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:58:14.433990 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433939 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:58:14.433990 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433942 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:58:14.433990 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433944 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:58:14.433990 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433947 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:58:14.433990 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433950 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:58:14.433990 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433953 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:58:14.433990 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433955 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:58:14.433990 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433958 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:58:14.433990 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433960 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:58:14.433990 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433963 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:58:14.433990 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433965 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:58:14.433990 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433968 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:58:14.433990 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433970 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:58:14.433990 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433973 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:58:14.433990 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433976 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:58:14.433990 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433979 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:58:14.434460 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433982 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:58:14.434460 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433986 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:58:14.434460 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433990 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:58:14.434460 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433993 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:58:14.434460 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433996 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:58:14.434460 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.433998 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:58:14.434460 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434001 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:58:14.434460 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434004 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:58:14.434460 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434007 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:58:14.434460 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434009 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:58:14.434460 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434012 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:58:14.434460 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434015 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:58:14.434460 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434017 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:58:14.434460 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434020 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:58:14.434460 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434022 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:58:14.434460 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434025 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:58:14.434460 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434027 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:58:14.434460 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434030 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:58:14.434460 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434032 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:58:14.434993 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434035 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:58:14.434993 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434037 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:58:14.434993 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434039 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:58:14.434993 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434042 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:58:14.434993 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434044 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:58:14.434993 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434048 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:58:14.434993 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434050 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:58:14.434993 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434053 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:58:14.434993 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434056 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:58:14.434993 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434058 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:58:14.434993 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434060 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:58:14.434993 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434063 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:58:14.434993 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434065 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:58:14.434993 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434068 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:58:14.434993 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434071 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:58:14.434993 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434073 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:58:14.434993 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434076 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:58:14.434993 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434078 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:58:14.434993 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434081 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:58:14.434993 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434083 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:58:14.435496 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434086 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:58:14.435496 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434088 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:58:14.435496 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434091 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:58:14.435496 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434093 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:58:14.435496 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434095 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:58:14.435496 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434099 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:58:14.435496 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434103 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:58:14.435496 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434106 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:58:14.435496 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434108 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:58:14.435496 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434111 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:58:14.435496 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434113 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:58:14.435496 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434115 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:58:14.435496 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434119 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:58:14.435496 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434121 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:58:14.435496 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434124 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:58:14.435496 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434126 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:58:14.435496 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434129 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:58:14.435496 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434131 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:58:14.435496 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434134 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:58:14.435496 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434151 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:58:14.436002 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434154 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:58:14.436002 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434156 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:58:14.436002 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434160 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:58:14.436002 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434163 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:58:14.436002 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434166 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:58:14.436002 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434168 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:58:14.436002 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434525 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:58:14.436002 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434530 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:58:14.436002 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434532 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:58:14.436002 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434535 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:58:14.436002 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434538 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:58:14.436002 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434540 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:58:14.436002 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434542 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:58:14.436002 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434545 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:58:14.436002 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434548 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:58:14.436002 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434550 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:58:14.436002 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434552 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:58:14.436002 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434555 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:58:14.436002 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434557 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:58:14.436002 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434560 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:58:14.436492 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434563 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:58:14.436492 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434565 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:58:14.436492 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434568 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:58:14.436492 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434570 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:58:14.436492 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434573 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:58:14.436492 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434576 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:58:14.436492 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434578 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:58:14.436492 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434581 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:58:14.436492 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434583 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:58:14.436492 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434586 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:58:14.436492 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434588 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:58:14.436492 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434591 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:58:14.436492 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434594 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:58:14.436492 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434596 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:58:14.436492 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434599 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:58:14.436492 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434601 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:58:14.436492 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434604 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:58:14.436492 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434606 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:58:14.436492 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434610 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:58:14.436492 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434613 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:58:14.436981 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434615 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:58:14.436981 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434618 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:58:14.436981 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434621 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:58:14.436981 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434623 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:58:14.436981 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434626 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:58:14.436981 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434628 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:58:14.436981 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434631 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:58:14.436981 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434634 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:58:14.436981 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434637 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:58:14.436981 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434640 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:58:14.436981 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434643 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:58:14.436981 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434645 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:58:14.436981 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434648 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:58:14.436981 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434650 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:58:14.436981 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434653 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:58:14.436981 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434655 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:58:14.436981 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434659 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:58:14.436981 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434663 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:58:14.436981 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434666 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:58:14.437464 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434669 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:58:14.437464 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434671 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:58:14.437464 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434674 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:58:14.437464 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434677 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:58:14.437464 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434680 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:58:14.437464 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434682 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:58:14.437464 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434684 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:58:14.437464 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434687 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:58:14.437464 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434690 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:58:14.437464 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434693 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:58:14.437464 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434695 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:58:14.437464 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434698 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:58:14.437464 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434700 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:58:14.437464 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434703 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:58:14.437464 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434706 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:58:14.437464 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434708 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:58:14.437464 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434711 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:58:14.437464 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434713 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:58:14.437464 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434716 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:58:14.437464 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434718 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434721 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434723 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434726 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434728 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434731 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434734 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434736 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434738 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434741 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434743 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434747 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434750 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.434752 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435289 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435302 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435308 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435313 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435317 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435320 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435324 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:58:14.437947 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435328 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435331 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435334 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435338 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435342 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435345 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435348 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435351 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435353 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435356 2579 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435359 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435362 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435367 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435370 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435373 2579 flags.go:64] FLAG: --config-dir=""
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435376 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435379 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435383 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435386 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435389 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435392 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435395 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435398 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435400 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435404 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:58:14.438476 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435407 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435411 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435414 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435417 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435420 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435423 2579 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435426 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435430 2579 flags.go:64] FLAG: --event-burst="100"
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435433 2579 flags.go:64] FLAG: --event-qps="50"
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435436 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435439 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435442 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435446 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435451 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435457 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435461 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435464 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435467 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435469 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435472 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435475 2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435478 2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435480 2579 flags.go:64] FLAG: --feature-gates=""
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435484 2579 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435487 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 19:58:14.439060 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435490 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435492 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435495 2579 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435498 2579 flags.go:64] FLAG: --help="false"
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435501 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-141-46.ec2.internal"
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435505 2579 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435508 2579 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435511 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435514 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435518 2579 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435521 2579 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435523 2579 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435526 2579 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435529 2579 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435532 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435535 2579 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435537 2579 flags.go:64] FLAG: --kube-reserved=""
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435540 2579 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435543 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435546 2579 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435550 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435553 2579 flags.go:64] FLAG: --lock-file=""
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435556 2579 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435559 2579 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 19:58:14.439668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435562 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435567 2579 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435569 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435572 2579 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435575 2579 flags.go:64] FLAG: --logging-format="text"
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435578 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435581 2579 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435584 2579 flags.go:64] FLAG: --manifest-url=""
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435587 2579 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435591 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435594 2579 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435598 2579 flags.go:64] FLAG: --max-pods="110"
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435601 2579 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435604 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435606 2579 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435609 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435612 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435615 2579 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435618 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435625 2579 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435628 2579 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435631 2579 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435634 2579 flags.go:64] FLAG: --pod-cidr=""
Apr 22 19:58:14.440259 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435637 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435642 2579 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435645 2579 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435649 2579 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435651 2579 flags.go:64] FLAG: --port="10250"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435655 2579 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435658 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0cd5ac4b13466a6d6"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435661 2579 flags.go:64] FLAG: --qos-reserved=""
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435664 2579 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435667 2579 flags.go:64] FLAG: --register-node="true"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435670 2579 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435673 2579 flags.go:64] FLAG: --register-with-taints=""
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435676 2579 flags.go:64] FLAG: --registry-burst="10"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435679 2579 flags.go:64] FLAG: --registry-qps="5"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435681 2579 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435684 2579 flags.go:64] FLAG: --reserved-memory=""
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435688 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435690 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435693 2579 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435696 2579 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435699 2579 flags.go:64] FLAG: --runonce="false"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435701 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435704 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435707 2579 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435710 2579 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435712 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 19:58:14.440842 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435716 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435719 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435722 2579 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435725 2579 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435727 2579 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435730 2579 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435732 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435735 2579 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435738 2579 flags.go:64] FLAG: --system-cgroups=""
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435744 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435750 2579 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435757 2579 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435759 2579 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435764 2579 flags.go:64] FLAG: --tls-min-version=""
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435766 2579 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435769 2579 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435772 2579 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435775 2579 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435778 2579 flags.go:64] FLAG: --v="2"
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435782 2579 flags.go:64] FLAG: --version="false"
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435786 2579 flags.go:64] FLAG: --vmodule=""
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435790 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.435793 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435886 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435889 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:58:14.441474 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435892 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:58:14.442077 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435896 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:58:14.442077 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435900 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:58:14.442077 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435903 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:58:14.442077 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435906 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:58:14.442077 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435908 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:58:14.442077 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435911 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:58:14.442077 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435914 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:58:14.442077 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435916 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:58:14.442077 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435919 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:58:14.442077 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435922 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:58:14.442077 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435924 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:58:14.442077 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435927 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:58:14.442077 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435929 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:58:14.442077 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435932 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:58:14.442077 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435934 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:58:14.442077 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435937 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:58:14.442077 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435940 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:58:14.442077 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435944 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:58:14.442077 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435946 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:58:14.442077 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435949 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:58:14.442595 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435951 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:58:14.442595 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435954 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:58:14.442595 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435956 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:58:14.442595 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435959 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:58:14.442595 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435961 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:58:14.442595 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435963 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:58:14.442595 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435966 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:58:14.442595 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435968 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:58:14.442595 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435971 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:58:14.442595 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435973 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:58:14.442595 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435976 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:58:14.442595 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435978 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:58:14.442595 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435980 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:58:14.442595 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435983 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:58:14.442595 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435985 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:58:14.442595 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435988 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:58:14.442595 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435990 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:58:14.442595 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435992 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:58:14.442595 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435995 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:58:14.442595 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.435997 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:58:14.443070 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436000 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:58:14.443070 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436002 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:58:14.443070 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436005 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:58:14.443070 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436008 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:58:14.443070 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436011 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:58:14.443070 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436013 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:58:14.443070 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436016 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:58:14.443070 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436019 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:58:14.443070 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436022 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:58:14.443070 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436025 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:58:14.443070 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436028 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:58:14.443070 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436030 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:58:14.443070 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436033 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:58:14.443070 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436035 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:58:14.443070 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436037 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:58:14.443070 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436040 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:58:14.443070 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436044 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:58:14.443070 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436047 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:58:14.443070 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436049 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:58:14.443548 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436052 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:58:14.443548 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436055 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:58:14.443548 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436057 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:58:14.443548 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436060 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:58:14.443548 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436063 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:58:14.443548 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436065 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:58:14.443548 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436068 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:58:14.443548 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436070 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:58:14.443548 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436073 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:58:14.443548 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436075 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:58:14.443548 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436078 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:58:14.443548 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436080 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:58:14.443548 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436083 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:58:14.443548 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436085 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:58:14.443548 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436088 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:58:14.443548 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436091 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:58:14.443548 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436093 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:58:14.443548 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436096 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:58:14.443548 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436098 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:58:14.443548 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436100 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:58:14.444067 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436104 2579 feature_gate.go:328] unrecognized 
feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:58:14.444067 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436106 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:58:14.444067 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436109 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:58:14.444067 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:14.436112 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:58:14.444067 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.436635 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:58:14.444067 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.443747 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 19:58:14.444067 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.443762 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
[duplicate block elided: between W0422 19:58:14.443809 and I0422 19:58:14.444388 the kubelet repeats the identical set of "unrecognized feature gate" warnings twice more, each pass ending with the same "Setting deprecated feature gate KMSv1=true" and "Setting GA feature gate ServiceAccountTokenNodeBinding=true" notices and a byte-for-byte identical "feature gates: {map[...]}" summary]
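For context on the wall of warnings above: the feature-gate list handed to the kubelet evidently includes OpenShift-level gates consumed by other operators, which the kubelet's own gate registry does not know, so each parsing pass logs one warning per unknown key before settling on the map printed at feature_gate.go:384. Below is a minimal sketch of that general parse-and-warn mechanism; names are illustrative only, not the k8s.io/component-base implementation.

package main

import (
	"fmt"
	"strconv"
	"strings"
)

// known stands in for the gates this binary actually registers;
// everything else triggers a warning, as in the log above.
var known = map[string]bool{
	"KMSv1":                          true,
	"NodeSwap":                       true,
	"ServiceAccountTokenNodeBinding": true,
}

// parseGates parses "key=bool" pairs the way a --feature-gates style
// option is handled: warn on unknown keys, reject bad booleans, and
// return the resulting map.
func parseGates(spec string) map[string]bool {
	gates := map[string]bool{}
	for _, pair := range strings.Split(spec, ",") {
		key, val, ok := strings.Cut(strings.TrimSpace(pair), "=")
		if !ok {
			fmt.Printf("malformed feature gate %q\n", pair)
			continue
		}
		if !known[key] {
			// The condition behind every "unrecognized feature gate" line.
			fmt.Printf("unrecognized feature gate: %s\n", key)
			continue
		}
		enabled, err := strconv.ParseBool(val)
		if err != nil {
			fmt.Printf("invalid value for %s: %v\n", key, err)
			continue
		}
		gates[key] = enabled
	}
	return gates
}

func main() {
	// AdminNetworkPolicy is an OpenShift gate the kubelet does not register.
	fmt.Println(parseGates("KMSv1=true,AdminNetworkPolicy=true,NodeSwap=false"))
}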
Apr 22 19:58:14.448300 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.444963 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 19:58:14.448635 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.446881 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 19:58:14.448635 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.447752 2579 server.go:1019] "Starting client certificate rotation"
Apr 22 19:58:14.448635 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.447853 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:58:14.448714 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.448634 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:58:14.469822 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.469803 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:58:14.473337 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.473315 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:58:14.493010 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.492987 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 22 19:58:14.499977 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.499959 2579 log.go:25] "Validated CRI v1 image API"
Apr 22 19:58:14.502824 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.502809 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 19:58:14.505113 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.505096 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:58:14.506201 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.506183 2579 fs.go:135] Filesystem UUIDs:
map[2a95e973-fa60-478b-92bd-2384dbac8e8e:/dev/nvme0n1p3 4ed96df1-d629-4dc4-a769-674af336eb53:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 22 19:58:14.506250 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.506202 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 19:58:14.510967 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.510863 2579 manager.go:217] Machine: {Timestamp:2026-04-22 19:58:14.509858596 +0000 UTC m=+0.464221390 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3105660 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2567af3549a3b73e96d1116ec833c6 SystemUUID:ec2567af-3549-a3b7-3e96-d1116ec833c6 BootID:28275045-583a-4c7c-a8d4-bbd93eaf267c Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c0:9e:df:7b:89 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c0:9e:df:7b:89 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:e2:04:6a:e1:4e:c0 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 19:58:14.510967 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.510961 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
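The bootstrap and rotation lines above (server.go:962, bootstrap.go:101, certificate_manager.go:566) show the kubelet requesting a client certificate with bootstrap credentials and then rotating it in the background. A simplified sketch of the rotation decision follows, assuming the conventional /var/lib/kubelet/pki/kubelet-client-current.pem location and an illustrative 0.8 lifetime fraction; the real certificate manager applies its own jittered policy.

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

// shouldRotate reports whether the certificate has passed the given
// fraction of its NotBefore..NotAfter lifetime.
func shouldRotate(certPEM []byte, frac float64, now time.Time) (bool, error) {
	// The current-cert file bundles cert and key; take the first PEM block.
	block, _ := pem.Decode(certPEM)
	if block == nil {
		return false, fmt.Errorf("no PEM block in input")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		return false, err
	}
	lifetime := cert.NotAfter.Sub(cert.NotBefore)
	// Rotate once `frac` of the lifetime has elapsed.
	deadline := cert.NotBefore.Add(time.Duration(frac * float64(lifetime)))
	return now.After(deadline), nil
}

func main() {
	// Conventional location of the kubelet's current client certificate.
	pemBytes, err := os.ReadFile("/var/lib/kubelet/pki/kubelet-client-current.pem")
	if err != nil {
		fmt.Println("read:", err)
		return
	}
	rotate, err := shouldRotate(pemBytes, 0.8, time.Now()) // 0.8 is an assumed fraction
	fmt.Println("rotate now:", rotate, "err:", err)
}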
Apr 22 19:58:14.511055 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.511030 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 19:58:14.511337 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.511315 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 19:58:14.511467 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.511340 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-46.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 19:58:14.511513 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.511476 2579 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 19:58:14.511513 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.511485 2579 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 19:58:14.511513 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.511497 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:58:14.512665 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.512655 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:58:14.513507 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.513497 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:58:14.513611 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.513602 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 19:58:14.516199 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.516190 2579 kubelet.go:491] "Attempting to sync node with API server" Apr 22 19:58:14.516231 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.516208 2579 kubelet.go:386] "Adding static pod path" 
path="/etc/kubernetes/manifests" Apr 22 19:58:14.516231 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.516221 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 19:58:14.516231 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.516229 2579 kubelet.go:397] "Adding apiserver pod source" Apr 22 19:58:14.516316 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.516238 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 19:58:14.517192 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.517180 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:58:14.517244 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.517198 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:58:14.519894 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.519878 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 19:58:14.521261 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.521245 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 19:58:14.522851 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.522829 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 19:58:14.522886 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.522862 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 19:58:14.522886 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.522870 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 19:58:14.522886 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.522877 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 19:58:14.522886 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.522884 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 19:58:14.522993 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.522892 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 19:58:14.522993 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.522898 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 19:58:14.522993 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.522904 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 19:58:14.522993 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.522913 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 19:58:14.522993 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.522920 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 19:58:14.522993 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.522943 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 19:58:14.522993 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.522953 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 19:58:14.523768 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.523759 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 19:58:14.523802 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.523769 2579 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 22 19:58:14.527305 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.527177 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-46.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 19:58:14.527305 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:14.527214 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 19:58:14.527437 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.527277 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 19:58:14.527437 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.527368 2579 server.go:1295] "Started kubelet" Apr 22 19:58:14.527437 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:14.527403 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-46.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 19:58:14.530447 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.527469 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 19:58:14.530447 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.530185 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 19:58:14.530689 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.530671 2579 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 19:58:14.530828 ip-10-0-141-46 systemd[1]: Started Kubernetes Kubelet. 
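The nodeConfig dump at container_manager_linux.go:275 above carries the hard eviction thresholds: memory.available below an absolute 100Mi, nodefs.available below 10% of capacity, and so on. The sketch below shows how such a threshold shape (absolute quantity or fraction of capacity) is evaluated; it mirrors the config, not the eviction manager's actual code, and the "available" figures are made up for illustration.

package main

import "fmt"

// Threshold mirrors the shape of one HardEvictionThresholds entry in
// the nodeConfig dump: an absolute quantity or a fraction of capacity.
type Threshold struct {
	Signal     string
	Quantity   int64   // absolute bytes, 0 if unset
	Percentage float64 // fraction of capacity, 0 if unset
}

// crossed reports whether `available` has dropped below the limit.
func (t Threshold) crossed(available, capacity int64) bool {
	limit := t.Quantity
	if t.Percentage > 0 {
		limit = int64(t.Percentage * float64(capacity))
	}
	return available < limit
}

func main() {
	mem := Threshold{Signal: "memory.available", Quantity: 100 * 1024 * 1024} // 100Mi
	fs := Threshold{Signal: "nodefs.available", Percentage: 0.10}             // 10%

	memCapacity := int64(32812167168) // MemoryCapacity from the manager.go:217 Machine line
	fsCapacity := int64(128243970048) // /var (nodefs) capacity from fs.go:136

	fmt.Println("memory eviction:", mem.crossed(80*1024*1024, memCapacity)) // 80Mi free -> true
	fmt.Println("nodefs eviction:", fs.crossed(20_000_000_000, fsCapacity)) // ~15.6% free -> false
}

With this node's capacities, the 10% nodefs threshold works out to roughly 12.8 GB of the 128 GB /var partition.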
Apr 22 19:58:14.532194 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.532133 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 19:58:14.532895 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.532875 2579 server.go:317] "Adding debug handlers to kubelet server" Apr 22 19:58:14.537320 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:14.536450 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-46.ec2.internal.18a8c6202efe13b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-46.ec2.internal,UID:ip-10-0-141-46.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-46.ec2.internal,},FirstTimestamp:2026-04-22 19:58:14.527316914 +0000 UTC m=+0.481679706,LastTimestamp:2026-04-22 19:58:14.527316914 +0000 UTC m=+0.481679706,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-46.ec2.internal,}" Apr 22 19:58:14.538569 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.538552 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 19:58:14.538649 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.538597 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 19:58:14.539264 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.539245 2579 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 19:58:14.539497 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.539483 2579 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 19:58:14.539664 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.539632 2579 factory.go:55] Registering systemd factory Apr 22 19:58:14.539664 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.539659 2579 factory.go:223] Registration of the systemd container factory successfully Apr 22 19:58:14.539824 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:14.539399 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found" Apr 22 19:58:14.539824 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:14.539541 2579 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 19:58:14.539824 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.539413 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 19:58:14.539824 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.539766 2579 reconstruct.go:97] "Volume reconstruction finished" Apr 22 19:58:14.539824 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.539776 2579 reconciler.go:26] "Reconciler: start to sync state" Apr 22 19:58:14.540097 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.540081 2579 factory.go:153] Registering CRI-O factory Apr 22 19:58:14.540097 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.540100 2579 factory.go:223] Registration of the crio container factory successfully Apr 22 19:58:14.540199 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.540166 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 19:58:14.540199 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.540186 2579 factory.go:103] Registering Raw factory Apr 22 19:58:14.540274 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.540200 2579 manager.go:1196] Started watching for new ooms in manager Apr 22 19:58:14.540689 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.540672 2579 manager.go:319] Starting recovery of all containers Apr 22 19:58:14.544751 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:14.544723 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 19:58:14.544959 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:14.544934 2579 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-141-46.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 19:58:14.551026 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.551012 2579 manager.go:324] Recovery completed Apr 22 19:58:14.551884 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.551851 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7sb4c" Apr 22 19:58:14.555644 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.555632 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:14.560668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.560652 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7sb4c" Apr 22 19:58:14.561986 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.561964 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:14.562061 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.562001 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 
19:58:14.562061 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.562018 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:14.562659 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.562640 2579 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 19:58:14.562737 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.562659 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 19:58:14.562737 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.562679 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:58:14.564084 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:14.564001 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-46.ec2.internal.18a8c620310f1ab1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-46.ec2.internal,UID:ip-10-0-141-46.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-141-46.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-141-46.ec2.internal,},FirstTimestamp:2026-04-22 19:58:14.561987249 +0000 UTC m=+0.516350047,LastTimestamp:2026-04-22 19:58:14.561987249 +0000 UTC m=+0.516350047,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-46.ec2.internal,}" Apr 22 19:58:14.565594 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.565581 2579 policy_none.go:49] "None policy: Start" Apr 22 19:58:14.565644 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.565605 2579 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 19:58:14.565644 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.565616 2579 state_mem.go:35] "Initializing new in-memory state store" Apr 22 19:58:14.624679 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.624664 2579 manager.go:341] "Starting Device Plugin manager" Apr 22 19:58:14.625820 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:14.624726 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 19:58:14.625820 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.624739 2579 server.go:85] "Starting device plugin registration server" Apr 22 19:58:14.625820 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.624965 2579 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 19:58:14.625820 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.624976 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 19:58:14.625820 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.625079 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 19:58:14.625820 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.625204 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 19:58:14.625820 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.625211 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 19:58:14.625820 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:14.625696 2579 eviction_manager.go:267] "eviction 
manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 19:58:14.625820 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:14.625736 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-46.ec2.internal\" not found" Apr 22 19:58:14.686934 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.686884 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 19:58:14.688231 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.688214 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 19:58:14.688320 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.688243 2579 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 19:58:14.688320 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.688263 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 19:58:14.688320 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.688271 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 19:58:14.688320 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:14.688307 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 19:58:14.690526 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.690508 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:58:14.725458 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.725429 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:14.726426 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.726411 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:14.726493 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.726439 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:14.726493 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.726449 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:14.726493 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.726470 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-46.ec2.internal" Apr 22 19:58:14.734777 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.734763 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-46.ec2.internal" Apr 22 19:58:14.734821 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:14.734785 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-46.ec2.internal\": node \"ip-10-0-141-46.ec2.internal\" not found" Apr 22 19:58:14.761829 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:14.761808 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found" Apr 22 19:58:14.788776 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.788754 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal"] Apr 22 19:58:14.788849 ip-10-0-141-46 
kubenswrapper[2579]: I0422 19:58:14.788839 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:14.789681 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.789659 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:14.789757 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.789693 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:14.789757 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.789718 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:14.790884 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.790871 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:14.791018 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.791005 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" Apr 22 19:58:14.791059 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.791039 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:14.791564 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.791541 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:14.791655 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.791572 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:14.791655 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.791582 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:14.791655 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.791548 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:14.791655 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.791641 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:14.791796 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.791657 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:14.792745 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.792732 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal" Apr 22 19:58:14.792796 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.792755 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:14.794050 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.794035 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:14.794125 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.794063 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:14.794125 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.794079 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:14.817736 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:14.817710 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-46.ec2.internal\" not found" node="ip-10-0-141-46.ec2.internal" Apr 22 19:58:14.821837 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:14.821819 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-46.ec2.internal\" not found" node="ip-10-0-141-46.ec2.internal" Apr 22 19:58:14.840850 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.840829 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ea9c91cb8023c6edf0f87546014505a5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal\" (UID: \"ea9c91cb8023c6edf0f87546014505a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" Apr 22 19:58:14.840929 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.840856 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea9c91cb8023c6edf0f87546014505a5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal\" (UID: \"ea9c91cb8023c6edf0f87546014505a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" Apr 22 19:58:14.840929 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.840872 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6551c028e13aff466c38397d8a508ac4-config\") pod \"kube-apiserver-proxy-ip-10-0-141-46.ec2.internal\" (UID: \"6551c028e13aff466c38397d8a508ac4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal" Apr 22 19:58:14.862407 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:14.862390 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found" Apr 22 19:58:14.941209 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.941169 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ea9c91cb8023c6edf0f87546014505a5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal\" (UID: \"ea9c91cb8023c6edf0f87546014505a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" Apr 22 19:58:14.941209 
ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.941193 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea9c91cb8023c6edf0f87546014505a5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal\" (UID: \"ea9c91cb8023c6edf0f87546014505a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" Apr 22 19:58:14.941335 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.941211 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6551c028e13aff466c38397d8a508ac4-config\") pod \"kube-apiserver-proxy-ip-10-0-141-46.ec2.internal\" (UID: \"6551c028e13aff466c38397d8a508ac4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal" Apr 22 19:58:14.941335 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.941271 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ea9c91cb8023c6edf0f87546014505a5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal\" (UID: \"ea9c91cb8023c6edf0f87546014505a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" Apr 22 19:58:14.941335 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.941284 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea9c91cb8023c6edf0f87546014505a5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal\" (UID: \"ea9c91cb8023c6edf0f87546014505a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" Apr 22 19:58:14.941335 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:14.941324 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6551c028e13aff466c38397d8a508ac4-config\") pod \"kube-apiserver-proxy-ip-10-0-141-46.ec2.internal\" (UID: \"6551c028e13aff466c38397d8a508ac4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal" Apr 22 19:58:14.963272 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:14.963252 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found" Apr 22 19:58:15.064005 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:15.063978 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found" Apr 22 19:58:15.119110 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:15.119086 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" Apr 22 19:58:15.124625 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:15.124610 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal" Apr 22 19:58:15.164784 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:15.164759 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found" Apr 22 19:58:15.265346 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:15.265259 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found" Apr 22 19:58:15.365812 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:15.365779 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found" Apr 22 19:58:15.447457 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:15.447434 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 19:58:15.447998 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:15.447570 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 19:58:15.466586 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:15.466561 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found" Apr 22 19:58:15.539377 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:15.539327 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 19:58:15.549912 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:15.549888 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:58:15.564564 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:15.564537 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:53:14 +0000 UTC" deadline="2028-01-27 01:54:14.62508726 +0000 UTC" Apr 22 19:58:15.564564 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:15.564563 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15461h55m59.060527172s" Apr 22 19:58:15.566637 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:15.566612 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found" Apr 22 19:58:15.572929 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:15.572908 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-dwd8x" Apr 22 19:58:15.580752 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:15.580734 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-dwd8x" Apr 22 19:58:15.634912 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:15.634876 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6551c028e13aff466c38397d8a508ac4.slice/crio-1cda46b49efac438d4da17a3d5b4b2f130f48b63209e2aaa32d10d995974040a WatchSource:0}: Error finding container 1cda46b49efac438d4da17a3d5b4b2f130f48b63209e2aaa32d10d995974040a: Status 404 returned 
error can't find the container with id 1cda46b49efac438d4da17a3d5b4b2f130f48b63209e2aaa32d10d995974040a Apr 22 19:58:15.635484 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:15.635463 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea9c91cb8023c6edf0f87546014505a5.slice/crio-b05a108698cb79254a4d709e961e1a3b6e5017b2b64baf3caa1482d4f423e3e1 WatchSource:0}: Error finding container b05a108698cb79254a4d709e961e1a3b6e5017b2b64baf3caa1482d4f423e3e1: Status 404 returned error can't find the container with id b05a108698cb79254a4d709e961e1a3b6e5017b2b64baf3caa1482d4f423e3e1 Apr 22 19:58:15.639487 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:15.639472 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:58:15.667009 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:15.666989 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found" Apr 22 19:58:15.690722 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:15.690677 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" event={"ID":"ea9c91cb8023c6edf0f87546014505a5","Type":"ContainerStarted","Data":"b05a108698cb79254a4d709e961e1a3b6e5017b2b64baf3caa1482d4f423e3e1"} Apr 22 19:58:15.691606 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:15.691588 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal" event={"ID":"6551c028e13aff466c38397d8a508ac4","Type":"ContainerStarted","Data":"1cda46b49efac438d4da17a3d5b4b2f130f48b63209e2aaa32d10d995974040a"} Apr 22 19:58:15.768076 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:15.768056 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found" Apr 22 19:58:15.865588 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:15.865538 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:58:15.868188 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:15.868176 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found" Apr 22 19:58:15.968732 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:15.968696 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found" Apr 22 19:58:15.982875 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:15.982855 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:58:16.039941 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.039914 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" Apr 22 19:58:16.052417 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.052389 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:58:16.054695 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.054652 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal" Apr 22 19:58:16.060826 ip-10-0-141-46 
kubenswrapper[2579]: I0422 19:58:16.060809 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:58:16.119944 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.119917 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:58:16.516975 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.516892 2579 apiserver.go:52] "Watching apiserver" Apr 22 19:58:16.523000 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.522971 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 19:58:16.523840 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.523819 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:58:16.523942 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.523884 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-vk6km","openshift-image-registry/node-ca-xzq74","openshift-network-diagnostics/network-check-target-64jvm","openshift-network-operator/iptables-alerter-qqmqh","kube-system/global-pull-secret-syncer-vlm4l","kube-system/konnectivity-agent-4jvwg","openshift-cluster-node-tuning-operator/tuned-lt5hj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal","openshift-multus/multus-5dh6n","openshift-multus/multus-additional-cni-plugins-86nst","openshift-multus/network-metrics-daemon-p2gjc","openshift-ovn-kubernetes/ovnkube-node-cw48v","kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh"] Apr 22 19:58:16.525922 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.525902 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.527460 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.527439 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.528574 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.528554 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" Apr 22 19:58:16.529716 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.529528 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qqmqh" Apr 22 19:58:16.530291 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.530260 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:58:16.530633 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.530614 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vk6km" Apr 22 19:58:16.531782 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.531762 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4jvwg" Apr 22 19:58:16.533006 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.532990 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.534453 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.534433 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:16.534547 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:16.534505 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2gjc" podUID="56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728" Apr 22 19:58:16.535743 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.535714 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:16.535832 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:16.535786 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vlm4l" podUID="d0402063-8f80-4f0b-8247-b3bd2ae51e51" Apr 22 19:58:16.536216 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.536198 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 19:58:16.536301 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.536247 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 19:58:16.536301 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.536291 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 19:58:16.536434 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.536203 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 19:58:16.536551 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.536535 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-fg666\"" Apr 22 19:58:16.537165 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.536684 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 19:58:16.537165 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.536815 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hpcg5\"" Apr 22 19:58:16.537165 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.536876 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-fxvrl\"" Apr 22 19:58:16.537165 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.536820 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:58:16.537165 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.536922 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 19:58:16.537165 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.536954 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 19:58:16.537165 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.537076 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-kcjxn\"" Apr 22 19:58:16.537165 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.537085 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 19:58:16.537165 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.537125 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wfwz5\"" Apr 22 19:58:16.537650 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.537196 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 19:58:16.537650 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.537270 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 19:58:16.537650 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.537297 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-snqz7\"" Apr 22 19:58:16.537650 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.537317 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 19:58:16.537650 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.537432 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 19:58:16.537650 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.537591 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 19:58:16.537925 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.537861 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 19:58:16.537992 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.537972 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cktzz\"" Apr 22 19:58:16.538084 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.538043 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 19:58:16.538430 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.538216 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 19:58:16.538430 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.538267 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xzq74" Apr 22 19:58:16.541499 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.540181 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:16.541499 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:16.540253 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64jvm" podUID="ebb887c1-8ca2-4d72-9902-320466d0dce7" Apr 22 19:58:16.541499 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.540423 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.542534 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.542514 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 19:58:16.542629 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.542550 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 19:58:16.545212 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.544951 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 19:58:16.545857 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.545839 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 19:58:16.546204 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.546186 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 19:58:16.548034 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.548012 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 19:58:16.548523 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.548261 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 19:58:16.548523 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.548318 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6c4lr\"" Apr 22 19:58:16.548523 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.548265 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 19:58:16.548523 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.548363 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 19:58:16.548727 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.548550 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-j5lfq\"" Apr 22 19:58:16.548727 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.548660 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 19:58:16.550807 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.550783 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-etc-openvswitch\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.550903 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.550820 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-run\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.550903 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.550841 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/58fec03b-a906-4fea-8253-f50ea8e5684b-device-dir\") pod \"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" Apr 22 19:58:16.550903 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.550866 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-cni-binary-copy\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.550903 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.550891 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret\") pod \"global-pull-secret-syncer-vlm4l\" (UID: \"d0402063-8f80-4f0b-8247-b3bd2ae51e51\") " pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:16.551163 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.550916 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-log-socket\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.551163 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.550937 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-lib-modules\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.551163 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.550975 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f5b56a04-ef22-4b41-8aa2-34438e2003fe-cni-binary-copy\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.551163 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551004 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/603cfe2d-1c53-4bc8-acc0-b1d1751c2817-tmp-dir\") pod \"node-resolver-vk6km\" (UID: \"603cfe2d-1c53-4bc8-acc0-b1d1751c2817\") " 
pod="openshift-dns/node-resolver-vk6km" Apr 22 19:58:16.551163 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551022 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-cni-bin\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.551163 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551061 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-kubernetes\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.551163 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551075 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-host\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.551163 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551099 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-tuned\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.551163 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551115 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fa7bef42-1723-430a-9e10-77a5e4e702fb-tmp\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.551163 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551132 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs\") pod \"network-metrics-daemon-p2gjc\" (UID: \"56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728\") " pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:16.551637 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551185 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58fec03b-a906-4fea-8253-f50ea8e5684b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" Apr 22 19:58:16.551637 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551245 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-system-cni-dir\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.551637 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551292 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-var-lib-openvswitch\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.551637 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551331 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx55h\" (UniqueName: \"kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h\") pod \"network-check-target-64jvm\" (UID: \"ebb887c1-8ca2-4d72-9902-320466d0dce7\") " pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:16.551637 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551365 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mzvx\" (UniqueName: \"kubernetes.io/projected/58fec03b-a906-4fea-8253-f50ea8e5684b-kube-api-access-9mzvx\") pod \"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" Apr 22 19:58:16.551637 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551392 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-host-var-lib-cni-bin\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.551637 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551419 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-host-var-lib-cni-multus\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.551637 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551459 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f5b56a04-ef22-4b41-8aa2-34438e2003fe-cnibin\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.551637 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551488 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-multus-cni-dir\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.551637 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551533 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-cnibin\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.551637 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551578 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-host-run-k8s-cni-cncf-io\") pod \"multus-5dh6n\" 
(UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.551637 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551609 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-multus-daemon-config\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.551637 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551637 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hs7p\" (UniqueName: \"kubernetes.io/projected/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-kube-api-access-7hs7p\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.552277 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551668 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-run-netns\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.552277 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551704 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d527q\" (UniqueName: \"kubernetes.io/projected/603cfe2d-1c53-4bc8-acc0-b1d1751c2817-kube-api-access-d527q\") pod \"node-resolver-vk6km\" (UID: \"603cfe2d-1c53-4bc8-acc0-b1d1751c2817\") " pod="openshift-dns/node-resolver-vk6km" Apr 22 19:58:16.552277 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551730 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-run-systemd\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.552277 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551764 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-run-ovn\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.552277 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551781 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsj97\" (UniqueName: \"kubernetes.io/projected/fa7bef42-1723-430a-9e10-77a5e4e702fb-kube-api-access-bsj97\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.552277 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551803 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clbtr\" (UniqueName: \"kubernetes.io/projected/4ae3c039-dc7f-4fc6-923b-489950577bc6-kube-api-access-clbtr\") pod \"iptables-alerter-qqmqh\" (UID: \"4ae3c039-dc7f-4fc6-923b-489950577bc6\") " pod="openshift-network-operator/iptables-alerter-qqmqh" Apr 22 19:58:16.552277 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551827 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f2e80311-0394-4386-8366-ef53a6861178-agent-certs\") pod \"konnectivity-agent-4jvwg\" (UID: \"f2e80311-0394-4386-8366-ef53a6861178\") " pod="kube-system/konnectivity-agent-4jvwg" Apr 22 19:58:16.552277 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551854 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-run-openvswitch\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.552277 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551900 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.552277 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551931 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-ovnkube-script-lib\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.552277 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551959 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f2e80311-0394-4386-8366-ef53a6861178-konnectivity-ca\") pod \"konnectivity-agent-4jvwg\" (UID: \"f2e80311-0394-4386-8366-ef53a6861178\") " pod="kube-system/konnectivity-agent-4jvwg" Apr 22 19:58:16.552277 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551975 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjrpj\" (UniqueName: \"kubernetes.io/projected/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-kube-api-access-bjrpj\") pod \"network-metrics-daemon-p2gjc\" (UID: \"56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728\") " pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:16.552277 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.551989 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7hdm\" (UniqueName: \"kubernetes.io/projected/799125ad-367b-42aa-a2e1-222e89529bac-kube-api-access-m7hdm\") pod \"node-ca-xzq74\" (UID: \"799125ad-367b-42aa-a2e1-222e89529bac\") " pod="openshift-image-registry/node-ca-xzq74" Apr 22 19:58:16.552277 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552009 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpq4h\" (UniqueName: \"kubernetes.io/projected/f5b56a04-ef22-4b41-8aa2-34438e2003fe-kube-api-access-vpq4h\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.552277 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552032 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-host-var-lib-kubelet\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.552277 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552054 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-slash\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.552943 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552071 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-multus-conf-dir\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.552943 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552100 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-run-ovn-kubernetes\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.552943 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552218 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-systemd\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.552943 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552253 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4ae3c039-dc7f-4fc6-923b-489950577bc6-iptables-alerter-script\") pod \"iptables-alerter-qqmqh\" (UID: \"4ae3c039-dc7f-4fc6-923b-489950577bc6\") " pod="openshift-network-operator/iptables-alerter-qqmqh" Apr 22 19:58:16.552943 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552281 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-kubelet\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.552943 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552304 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-sysconfig\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.552943 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552328 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/58fec03b-a906-4fea-8253-f50ea8e5684b-etc-selinux\") pod 
\"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" Apr 22 19:58:16.552943 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552350 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/799125ad-367b-42aa-a2e1-222e89529bac-host\") pod \"node-ca-xzq74\" (UID: \"799125ad-367b-42aa-a2e1-222e89529bac\") " pod="openshift-image-registry/node-ca-xzq74" Apr 22 19:58:16.552943 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552372 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f5b56a04-ef22-4b41-8aa2-34438e2003fe-os-release\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.552943 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552413 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f5b56a04-ef22-4b41-8aa2-34438e2003fe-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.552943 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552461 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/603cfe2d-1c53-4bc8-acc0-b1d1751c2817-hosts-file\") pod \"node-resolver-vk6km\" (UID: \"603cfe2d-1c53-4bc8-acc0-b1d1751c2817\") " pod="openshift-dns/node-resolver-vk6km" Apr 22 19:58:16.552943 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552485 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-node-log\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.552943 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552536 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-cni-netd\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.552943 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552564 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8hzk\" (UniqueName: \"kubernetes.io/projected/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-kube-api-access-q8hzk\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.552943 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552588 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/58fec03b-a906-4fea-8253-f50ea8e5684b-socket-dir\") pod \"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" Apr 22 19:58:16.552943 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552613 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-multus-socket-dir-parent\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.553566 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552639 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d0402063-8f80-4f0b-8247-b3bd2ae51e51-kubelet-config\") pod \"global-pull-secret-syncer-vlm4l\" (UID: \"d0402063-8f80-4f0b-8247-b3bd2ae51e51\") " pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:16.553566 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552660 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-sysctl-conf\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.553566 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552674 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4ae3c039-dc7f-4fc6-923b-489950577bc6-host-slash\") pod \"iptables-alerter-qqmqh\" (UID: \"4ae3c039-dc7f-4fc6-923b-489950577bc6\") " pod="openshift-network-operator/iptables-alerter-qqmqh" Apr 22 19:58:16.553566 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552688 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f5b56a04-ef22-4b41-8aa2-34438e2003fe-tuning-conf-dir\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.553566 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552720 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f5b56a04-ef22-4b41-8aa2-34438e2003fe-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.553566 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552740 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-systemd-units\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.553566 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552754 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-ovnkube-config\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.553566 ip-10-0-141-46 
kubenswrapper[2579]: I0422 19:58:16.552767 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-sysctl-d\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj"
Apr 22 19:58:16.553566 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552781 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/58fec03b-a906-4fea-8253-f50ea8e5684b-registration-dir\") pod \"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh"
Apr 22 19:58:16.553566 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552802 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/58fec03b-a906-4fea-8253-f50ea8e5684b-sys-fs\") pod \"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh"
Apr 22 19:58:16.553566 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552817 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-host-run-netns\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n"
Apr 22 19:58:16.553566 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552852 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-host-run-multus-certs\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n"
Apr 22 19:58:16.553566 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552895 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-etc-kubernetes\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n"
Apr 22 19:58:16.553566 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552917 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d0402063-8f80-4f0b-8247-b3bd2ae51e51-dbus\") pod \"global-pull-secret-syncer-vlm4l\" (UID: \"d0402063-8f80-4f0b-8247-b3bd2ae51e51\") " pod="kube-system/global-pull-secret-syncer-vlm4l"
Apr 22 19:58:16.553566 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552933 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-modprobe-d\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj"
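The reconciler_common.go entries above, together with the operation_generator.go entries that follow, trace the kubelet volume manager's reconciler: for every volume in a pod's desired state of the world it first runs VerifyControllerAttachedVolume, then issues MountVolume, and records MountVolume.SetUp succeeded once the volume is in the actual state of the world. A minimal sketch of that desired-state/actual-state loop (illustrative types and messages only, not the kubelet's real code):

package main

import "fmt"

// volume is a stand-in for the kubelet's per-pod volume bookkeeping.
type volume struct{ name, pod string }

// reconcile walks the desired state and mounts anything not yet in the
// actual state, mirroring the started/succeeded pairs in the log above.
func reconcile(desired []volume, mounted map[string]bool) {
	for _, v := range desired {
		if mounted[v.name] {
			continue // already in the actual state of the world
		}
		fmt.Printf("VerifyControllerAttachedVolume started for volume %q pod %q\n", v.name, v.pod)
		fmt.Printf("MountVolume started for volume %q pod %q\n", v.name, v.pod)
		mounted[v.name] = true
		fmt.Printf("MountVolume.SetUp succeeded for volume %q pod %q\n", v.name, v.pod)
	}
}

func main() {
	desired := []volume{
		{"host-slash", "ovnkube-node-cw48v"},
		{"multus-conf-dir", "multus-5dh6n"},
	}
	reconcile(desired, map[string]bool{})
}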
\"kubernetes.io/configmap/799125ad-367b-42aa-a2e1-222e89529bac-serviceca\") pod \"node-ca-xzq74\" (UID: \"799125ad-367b-42aa-a2e1-222e89529bac\") " pod="openshift-image-registry/node-ca-xzq74" Apr 22 19:58:16.554127 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.552979 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-hostroot\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.554127 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.553008 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-env-overrides\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.554127 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.553031 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-ovn-node-metrics-cert\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.554127 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.553052 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-sys\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.554127 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.553067 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-var-lib-kubelet\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.554127 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.553087 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f5b56a04-ef22-4b41-8aa2-34438e2003fe-system-cni-dir\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.554127 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.553109 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-os-release\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.581318 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.581290 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:53:15 +0000 UTC" deadline="2027-11-17 13:08:35.934008563 +0000 UTC" Apr 22 19:58:16.581318 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.581317 2579 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="13769h10m19.352694761s" Apr 22 19:58:16.654298 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654259 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-slash\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.654298 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654300 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-multus-conf-dir\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.654508 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654315 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-run-ovn-kubernetes\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.654508 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654366 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-slash\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.654508 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654371 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-multus-conf-dir\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.654508 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654401 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-run-ovn-kubernetes\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.654508 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654442 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-systemd\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.654508 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654479 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4ae3c039-dc7f-4fc6-923b-489950577bc6-iptables-alerter-script\") pod \"iptables-alerter-qqmqh\" (UID: \"4ae3c039-dc7f-4fc6-923b-489950577bc6\") " pod="openshift-network-operator/iptables-alerter-qqmqh" Apr 22 19:58:16.654508 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654504 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-kubelet\") pod 
\"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.654836 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654528 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-sysconfig\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.654836 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654554 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/58fec03b-a906-4fea-8253-f50ea8e5684b-etc-selinux\") pod \"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" Apr 22 19:58:16.654836 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654555 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-systemd\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.654836 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654599 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-sysconfig\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.654836 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654637 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/799125ad-367b-42aa-a2e1-222e89529bac-host\") pod \"node-ca-xzq74\" (UID: \"799125ad-367b-42aa-a2e1-222e89529bac\") " pod="openshift-image-registry/node-ca-xzq74" Apr 22 19:58:16.654836 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654666 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f5b56a04-ef22-4b41-8aa2-34438e2003fe-os-release\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.654836 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654689 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/799125ad-367b-42aa-a2e1-222e89529bac-host\") pod \"node-ca-xzq74\" (UID: \"799125ad-367b-42aa-a2e1-222e89529bac\") " pod="openshift-image-registry/node-ca-xzq74" Apr 22 19:58:16.654836 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654691 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f5b56a04-ef22-4b41-8aa2-34438e2003fe-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.654836 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654753 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/f5b56a04-ef22-4b41-8aa2-34438e2003fe-os-release\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.654836 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654799 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/603cfe2d-1c53-4bc8-acc0-b1d1751c2817-hosts-file\") pod \"node-resolver-vk6km\" (UID: \"603cfe2d-1c53-4bc8-acc0-b1d1751c2817\") " pod="openshift-dns/node-resolver-vk6km" Apr 22 19:58:16.654836 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654825 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-node-log\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.657925 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654630 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/58fec03b-a906-4fea-8253-f50ea8e5684b-etc-selinux\") pod \"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" Apr 22 19:58:16.657925 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.656023 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-cni-netd\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.657925 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.654670 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-kubelet\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.657925 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.656064 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/603cfe2d-1c53-4bc8-acc0-b1d1751c2817-hosts-file\") pod \"node-resolver-vk6km\" (UID: \"603cfe2d-1c53-4bc8-acc0-b1d1751c2817\") " pod="openshift-dns/node-resolver-vk6km" Apr 22 19:58:16.657925 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.656125 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-node-log\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.657925 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.656303 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4ae3c039-dc7f-4fc6-923b-489950577bc6-iptables-alerter-script\") pod \"iptables-alerter-qqmqh\" (UID: \"4ae3c039-dc7f-4fc6-923b-489950577bc6\") " pod="openshift-network-operator/iptables-alerter-qqmqh" Apr 22 19:58:16.658242 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.657943 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f5b56a04-ef22-4b41-8aa2-34438e2003fe-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.658278 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658257 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-cni-netd\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.658316 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658288 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8hzk\" (UniqueName: \"kubernetes.io/projected/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-kube-api-access-q8hzk\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.658352 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658314 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/58fec03b-a906-4fea-8253-f50ea8e5684b-socket-dir\") pod \"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" Apr 22 19:58:16.658352 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658335 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-multus-socket-dir-parent\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.658436 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658355 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d0402063-8f80-4f0b-8247-b3bd2ae51e51-kubelet-config\") pod \"global-pull-secret-syncer-vlm4l\" (UID: \"d0402063-8f80-4f0b-8247-b3bd2ae51e51\") " pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:16.658436 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658377 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-sysctl-conf\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.658436 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658393 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4ae3c039-dc7f-4fc6-923b-489950577bc6-host-slash\") pod \"iptables-alerter-qqmqh\" (UID: \"4ae3c039-dc7f-4fc6-923b-489950577bc6\") " pod="openshift-network-operator/iptables-alerter-qqmqh" Apr 22 19:58:16.658436 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658412 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f5b56a04-ef22-4b41-8aa2-34438e2003fe-tuning-conf-dir\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " 
pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.658586 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658462 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f5b56a04-ef22-4b41-8aa2-34438e2003fe-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.658586 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658483 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-systemd-units\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.658586 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658499 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-ovnkube-config\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.658586 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658519 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-sysctl-d\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.658586 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658520 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f5b56a04-ef22-4b41-8aa2-34438e2003fe-tuning-conf-dir\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.658803 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658593 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4ae3c039-dc7f-4fc6-923b-489950577bc6-host-slash\") pod \"iptables-alerter-qqmqh\" (UID: \"4ae3c039-dc7f-4fc6-923b-489950577bc6\") " pod="openshift-network-operator/iptables-alerter-qqmqh" Apr 22 19:58:16.658803 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658624 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/58fec03b-a906-4fea-8253-f50ea8e5684b-registration-dir\") pod \"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" Apr 22 19:58:16.658803 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658642 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/58fec03b-a906-4fea-8253-f50ea8e5684b-socket-dir\") pod \"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" Apr 22 19:58:16.658803 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658644 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/58fec03b-a906-4fea-8253-f50ea8e5684b-sys-fs\") pod \"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" Apr 22 19:58:16.658803 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658672 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-host-run-netns\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.658803 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658678 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/58fec03b-a906-4fea-8253-f50ea8e5684b-registration-dir\") pod \"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" Apr 22 19:58:16.658803 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658683 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-multus-socket-dir-parent\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.658803 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658690 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-host-run-multus-certs\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.658803 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658710 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-etc-kubernetes\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.658803 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658713 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/58fec03b-a906-4fea-8253-f50ea8e5684b-sys-fs\") pod \"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" Apr 22 19:58:16.658803 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658739 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-systemd-units\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.658803 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658745 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d0402063-8f80-4f0b-8247-b3bd2ae51e51-kubelet-config\") pod \"global-pull-secret-syncer-vlm4l\" (UID: \"d0402063-8f80-4f0b-8247-b3bd2ae51e51\") " pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:16.658803 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658757 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d0402063-8f80-4f0b-8247-b3bd2ae51e51-dbus\") pod \"global-pull-secret-syncer-vlm4l\" (UID: \"d0402063-8f80-4f0b-8247-b3bd2ae51e51\") " pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:16.659244 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658844 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-sysctl-conf\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.659244 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658850 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d0402063-8f80-4f0b-8247-b3bd2ae51e51-dbus\") pod \"global-pull-secret-syncer-vlm4l\" (UID: \"d0402063-8f80-4f0b-8247-b3bd2ae51e51\") " pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:16.659244 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658892 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-host-run-netns\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.659244 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658919 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-etc-kubernetes\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.659244 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.658955 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-host-run-multus-certs\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.659244 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.659035 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-sysctl-d\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.659244 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.659121 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-ovnkube-config\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.659244 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.659131 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-modprobe-d\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.660849 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.659625 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-modprobe-d\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.660849 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.659633 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/799125ad-367b-42aa-a2e1-222e89529bac-serviceca\") pod \"node-ca-xzq74\" (UID: \"799125ad-367b-42aa-a2e1-222e89529bac\") " pod="openshift-image-registry/node-ca-xzq74" Apr 22 19:58:16.660849 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.659688 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-hostroot\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.660849 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.659713 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-env-overrides\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.660849 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.659738 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-ovn-node-metrics-cert\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.660849 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.659764 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-sys\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.660849 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.659777 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f5b56a04-ef22-4b41-8aa2-34438e2003fe-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.660849 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.659786 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-var-lib-kubelet\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.660849 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.659828 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-var-lib-kubelet\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.660849 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.659843 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f5b56a04-ef22-4b41-8aa2-34438e2003fe-system-cni-dir\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.660849 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.659868 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-hostroot\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.660849 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.659871 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-os-release\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.660849 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.659898 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-etc-openvswitch\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.660849 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.659922 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-run\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.660849 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.659948 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/58fec03b-a906-4fea-8253-f50ea8e5684b-device-dir\") pod \"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" Apr 22 19:58:16.660849 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.659967 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/799125ad-367b-42aa-a2e1-222e89529bac-serviceca\") pod \"node-ca-xzq74\" (UID: \"799125ad-367b-42aa-a2e1-222e89529bac\") " pod="openshift-image-registry/node-ca-xzq74" Apr 22 19:58:16.660849 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.659972 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-cni-binary-copy\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.660849 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660017 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret\") pod \"global-pull-secret-syncer-vlm4l\" (UID: \"d0402063-8f80-4f0b-8247-b3bd2ae51e51\") " pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:16.661885 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660044 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-log-socket\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.661885 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660070 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-lib-modules\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.661885 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660093 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f5b56a04-ef22-4b41-8aa2-34438e2003fe-cni-binary-copy\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.661885 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660116 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/603cfe2d-1c53-4bc8-acc0-b1d1751c2817-tmp-dir\") pod \"node-resolver-vk6km\" (UID: \"603cfe2d-1c53-4bc8-acc0-b1d1751c2817\") " pod="openshift-dns/node-resolver-vk6km" Apr 22 19:58:16.661885 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660161 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-run\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.661885 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660162 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-cni-bin\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.661885 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660198 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-cni-bin\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.661885 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660203 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-kubernetes\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.661885 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660231 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-host\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.661885 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660242 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-kubernetes\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.661885 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660245 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-lib-modules\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.661885 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660257 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-tuned\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.661885 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660288 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-etc-openvswitch\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.661885 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660291 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-log-socket\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.661885 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660312 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fa7bef42-1723-430a-9e10-77a5e4e702fb-tmp\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.661885 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660337 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs\") pod \"network-metrics-daemon-p2gjc\" (UID: \"56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728\") " pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:16.661885 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660361 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58fec03b-a906-4fea-8253-f50ea8e5684b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" Apr 22 19:58:16.661885 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:16.660381 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:16.662647 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660417 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f5b56a04-ef22-4b41-8aa2-34438e2003fe-system-cni-dir\") pod \"multus-additional-cni-plugins-86nst\" (UID: 
\"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.662647 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660433 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-env-overrides\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.662647 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660441 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/58fec03b-a906-4fea-8253-f50ea8e5684b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" Apr 22 19:58:16.662647 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:16.660442 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret podName:d0402063-8f80-4f0b-8247-b3bd2ae51e51 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:17.160423058 +0000 UTC m=+3.114785853 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret") pod "global-pull-secret-syncer-vlm4l" (UID: "d0402063-8f80-4f0b-8247-b3bd2ae51e51") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:16.662647 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660493 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-cni-binary-copy\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.662647 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660505 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/58fec03b-a906-4fea-8253-f50ea8e5684b-device-dir\") pod \"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" Apr 22 19:58:16.662647 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660382 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-system-cni-dir\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.662647 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660510 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-system-cni-dir\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.662647 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660507 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-os-release\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.662647 
Apr 22 19:58:16.662647 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660548 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-var-lib-openvswitch\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v"
Apr 22 19:58:16.662647 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660556 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-host\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj"
Apr 22 19:58:16.662647 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660567 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xx55h\" (UniqueName: \"kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h\") pod \"network-check-target-64jvm\" (UID: \"ebb887c1-8ca2-4d72-9902-320466d0dce7\") " pod="openshift-network-diagnostics/network-check-target-64jvm"
Apr 22 19:58:16.662647 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660094 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fa7bef42-1723-430a-9e10-77a5e4e702fb-sys\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj"
Apr 22 19:58:16.662647 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660595 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-var-lib-openvswitch\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v"
Apr 22 19:58:16.662647 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:16.660597 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:16.662647 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660611 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/603cfe2d-1c53-4bc8-acc0-b1d1751c2817-tmp-dir\") pod \"node-resolver-vk6km\" (UID: \"603cfe2d-1c53-4bc8-acc0-b1d1751c2817\") " pod="openshift-dns/node-resolver-vk6km"
Apr 22 19:58:16.662647 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660630 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mzvx\" (UniqueName: \"kubernetes.io/projected/58fec03b-a906-4fea-8253-f50ea8e5684b-kube-api-access-9mzvx\") pod \"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh"
Apr 22 19:58:16.663436 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:16.660664 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs podName:56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:17.16064719 +0000 UTC m=+3.115009969 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs") pod "network-metrics-daemon-p2gjc" (UID: "56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:16.663436 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660682 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-host-var-lib-cni-bin\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.663436 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660710 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-host-var-lib-cni-multus\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.663436 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660713 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 19:58:16.663436 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660723 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-host-var-lib-cni-bin\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.663436 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660737 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f5b56a04-ef22-4b41-8aa2-34438e2003fe-cnibin\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.663436 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660762 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-multus-cni-dir\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.663436 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660769 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-host-var-lib-cni-multus\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.663436 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660796 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f5b56a04-ef22-4b41-8aa2-34438e2003fe-cnibin\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.663436 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660809 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-multus-cni-dir\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.663436 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660828 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-cnibin\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.663436 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660849 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-host-run-k8s-cni-cncf-io\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.663436 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660893 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-multus-daemon-config\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.663436 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660910 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-cnibin\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.663436 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660918 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hs7p\" (UniqueName: \"kubernetes.io/projected/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-kube-api-access-7hs7p\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.663436 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660931 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f5b56a04-ef22-4b41-8aa2-34438e2003fe-cni-binary-copy\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.663436 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660956 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-run-netns\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.664168 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660959 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-host-run-k8s-cni-cncf-io\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.664168 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660979 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d527q\" (UniqueName: 
\"kubernetes.io/projected/603cfe2d-1c53-4bc8-acc0-b1d1751c2817-kube-api-access-d527q\") pod \"node-resolver-vk6km\" (UID: \"603cfe2d-1c53-4bc8-acc0-b1d1751c2817\") " pod="openshift-dns/node-resolver-vk6km" Apr 22 19:58:16.664168 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.660995 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-run-systemd\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.664168 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.661009 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-run-netns\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.664168 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.661015 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-run-ovn\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.664168 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.661032 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsj97\" (UniqueName: \"kubernetes.io/projected/fa7bef42-1723-430a-9e10-77a5e4e702fb-kube-api-access-bsj97\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.664168 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.661047 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clbtr\" (UniqueName: \"kubernetes.io/projected/4ae3c039-dc7f-4fc6-923b-489950577bc6-kube-api-access-clbtr\") pod \"iptables-alerter-qqmqh\" (UID: \"4ae3c039-dc7f-4fc6-923b-489950577bc6\") " pod="openshift-network-operator/iptables-alerter-qqmqh" Apr 22 19:58:16.664168 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.661063 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f2e80311-0394-4386-8366-ef53a6861178-agent-certs\") pod \"konnectivity-agent-4jvwg\" (UID: \"f2e80311-0394-4386-8366-ef53a6861178\") " pod="kube-system/konnectivity-agent-4jvwg" Apr 22 19:58:16.664168 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.661083 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-run-openvswitch\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.664168 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.661115 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.664168 ip-10-0-141-46 kubenswrapper[2579]: I0422 
19:58:16.661160 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-ovnkube-script-lib\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.664168 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.661189 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f2e80311-0394-4386-8366-ef53a6861178-konnectivity-ca\") pod \"konnectivity-agent-4jvwg\" (UID: \"f2e80311-0394-4386-8366-ef53a6861178\") " pod="kube-system/konnectivity-agent-4jvwg" Apr 22 19:58:16.664168 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.661211 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjrpj\" (UniqueName: \"kubernetes.io/projected/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-kube-api-access-bjrpj\") pod \"network-metrics-daemon-p2gjc\" (UID: \"56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728\") " pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:16.664168 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.661234 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7hdm\" (UniqueName: \"kubernetes.io/projected/799125ad-367b-42aa-a2e1-222e89529bac-kube-api-access-m7hdm\") pod \"node-ca-xzq74\" (UID: \"799125ad-367b-42aa-a2e1-222e89529bac\") " pod="openshift-image-registry/node-ca-xzq74" Apr 22 19:58:16.664168 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.661257 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpq4h\" (UniqueName: \"kubernetes.io/projected/f5b56a04-ef22-4b41-8aa2-34438e2003fe-kube-api-access-vpq4h\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.664168 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.661274 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-run-systemd\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.664168 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.661280 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-host-var-lib-kubelet\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.664973 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.661299 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-run-ovn\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.664973 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.661316 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-host-var-lib-kubelet\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " 
pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.664973 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.661341 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-multus-daemon-config\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.664973 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.661722 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-run-openvswitch\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.664973 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.661765 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.664973 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.661798 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-ovnkube-script-lib\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.664973 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.662453 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f2e80311-0394-4386-8366-ef53a6861178-konnectivity-ca\") pod \"konnectivity-agent-4jvwg\" (UID: \"f2e80311-0394-4386-8366-ef53a6861178\") " pod="kube-system/konnectivity-agent-4jvwg" Apr 22 19:58:16.664973 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.664184 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fa7bef42-1723-430a-9e10-77a5e4e702fb-tmp\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.664973 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.664237 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fa7bef42-1723-430a-9e10-77a5e4e702fb-etc-tuned\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.664973 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.664503 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f2e80311-0394-4386-8366-ef53a6861178-agent-certs\") pod \"konnectivity-agent-4jvwg\" (UID: \"f2e80311-0394-4386-8366-ef53a6861178\") " pod="kube-system/konnectivity-agent-4jvwg" Apr 22 19:58:16.667640 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.667617 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-ovn-node-metrics-cert\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.682052 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:16.682025 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:16.682052 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:16.682049 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:16.682235 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:16.682061 2579 projected.go:194] Error preparing data for projected volume kube-api-access-xx55h for pod openshift-network-diagnostics/network-check-target-64jvm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:16.682235 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:16.682120 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h podName:ebb887c1-8ca2-4d72-9902-320466d0dce7 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:17.182104691 +0000 UTC m=+3.136467473 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-xx55h" (UniqueName: "kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h") pod "network-check-target-64jvm" (UID: "ebb887c1-8ca2-4d72-9902-320466d0dce7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:16.682235 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.682171 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hs7p\" (UniqueName: \"kubernetes.io/projected/76458b53-1eb2-41d5-b4c3-01ca91b6f22d-kube-api-access-7hs7p\") pod \"multus-5dh6n\" (UID: \"76458b53-1eb2-41d5-b4c3-01ca91b6f22d\") " pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.683722 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.683703 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8hzk\" (UniqueName: \"kubernetes.io/projected/8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c-kube-api-access-q8hzk\") pod \"ovnkube-node-cw48v\" (UID: \"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:16.685810 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.685787 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mzvx\" (UniqueName: \"kubernetes.io/projected/58fec03b-a906-4fea-8253-f50ea8e5684b-kube-api-access-9mzvx\") pod \"aws-ebs-csi-driver-node-7fcnh\" (UID: \"58fec03b-a906-4fea-8253-f50ea8e5684b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" Apr 22 19:58:16.685913 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.685870 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7hdm\" (UniqueName: \"kubernetes.io/projected/799125ad-367b-42aa-a2e1-222e89529bac-kube-api-access-m7hdm\") pod \"node-ca-xzq74\" (UID: \"799125ad-367b-42aa-a2e1-222e89529bac\") " pod="openshift-image-registry/node-ca-xzq74" Apr 22 19:58:16.688973 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.688945 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-d527q\" (UniqueName: \"kubernetes.io/projected/603cfe2d-1c53-4bc8-acc0-b1d1751c2817-kube-api-access-d527q\") pod \"node-resolver-vk6km\" (UID: \"603cfe2d-1c53-4bc8-acc0-b1d1751c2817\") " pod="openshift-dns/node-resolver-vk6km" Apr 22 19:58:16.689395 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.689372 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsj97\" (UniqueName: \"kubernetes.io/projected/fa7bef42-1723-430a-9e10-77a5e4e702fb-kube-api-access-bsj97\") pod \"tuned-lt5hj\" (UID: \"fa7bef42-1723-430a-9e10-77a5e4e702fb\") " pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.689395 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.689392 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpq4h\" (UniqueName: \"kubernetes.io/projected/f5b56a04-ef22-4b41-8aa2-34438e2003fe-kube-api-access-vpq4h\") pod \"multus-additional-cni-plugins-86nst\" (UID: \"f5b56a04-ef22-4b41-8aa2-34438e2003fe\") " pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.690205 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.690173 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjrpj\" (UniqueName: \"kubernetes.io/projected/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-kube-api-access-bjrpj\") pod \"network-metrics-daemon-p2gjc\" (UID: \"56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728\") " pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:16.690683 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.690662 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clbtr\" (UniqueName: \"kubernetes.io/projected/4ae3c039-dc7f-4fc6-923b-489950577bc6-kube-api-access-clbtr\") pod \"iptables-alerter-qqmqh\" (UID: \"4ae3c039-dc7f-4fc6-923b-489950577bc6\") " pod="openshift-network-operator/iptables-alerter-qqmqh" Apr 22 19:58:16.837973 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.837879 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" Apr 22 19:58:16.848723 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.848700 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vk6km" Apr 22 19:58:16.856329 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.856304 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-86nst" Apr 22 19:58:16.860929 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.860910 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" Apr 22 19:58:16.868574 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.868557 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qqmqh" Apr 22 19:58:16.874055 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.874035 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4jvwg" Apr 22 19:58:16.881818 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.881802 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5dh6n" Apr 22 19:58:16.888049 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.888033 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xzq74" Apr 22 19:58:16.891775 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:16.891755 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:17.165097 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:17.165050 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret\") pod \"global-pull-secret-syncer-vlm4l\" (UID: \"d0402063-8f80-4f0b-8247-b3bd2ae51e51\") " pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:17.165276 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:17.165108 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs\") pod \"network-metrics-daemon-p2gjc\" (UID: \"56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728\") " pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:17.165276 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:17.165222 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:17.165401 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:17.165309 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret podName:d0402063-8f80-4f0b-8247-b3bd2ae51e51 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:18.165293147 +0000 UTC m=+4.119655926 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret") pod "global-pull-secret-syncer-vlm4l" (UID: "d0402063-8f80-4f0b-8247-b3bd2ae51e51") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:17.165401 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:17.165227 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:17.165401 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:17.165375 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs podName:56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:18.165359491 +0000 UTC m=+4.119722272 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs") pod "network-metrics-daemon-p2gjc" (UID: "56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:17.228638 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:17.228610 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2e80311_0394_4386_8366_ef53a6861178.slice/crio-2b70a3c4cc455657a6deffad72d1c3d405ffeac73aa97a018c9069fe76f4bbb2 WatchSource:0}: Error finding container 2b70a3c4cc455657a6deffad72d1c3d405ffeac73aa97a018c9069fe76f4bbb2: Status 404 returned error can't find the container with id 2b70a3c4cc455657a6deffad72d1c3d405ffeac73aa97a018c9069fe76f4bbb2 Apr 22 19:58:17.231661 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:17.231585 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ae3c039_dc7f_4fc6_923b_489950577bc6.slice/crio-6bef138be90aae4c73e8c0a7214f0e38dfd43007009d56e8a85680934127bea4 WatchSource:0}: Error finding container 6bef138be90aae4c73e8c0a7214f0e38dfd43007009d56e8a85680934127bea4: Status 404 returned error can't find the container with id 6bef138be90aae4c73e8c0a7214f0e38dfd43007009d56e8a85680934127bea4 Apr 22 19:58:17.233853 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:17.233834 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod799125ad_367b_42aa_a2e1_222e89529bac.slice/crio-dfd24a584128da745843fb79c2b3c896fb31475a31771f3d3bbaf22844fc0133 WatchSource:0}: Error finding container dfd24a584128da745843fb79c2b3c896fb31475a31771f3d3bbaf22844fc0133: Status 404 returned error can't find the container with id dfd24a584128da745843fb79c2b3c896fb31475a31771f3d3bbaf22844fc0133 Apr 22 19:58:17.234606 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:17.234582 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5b56a04_ef22_4b41_8aa2_34438e2003fe.slice/crio-41b957c7c9d34b7b77d98cb2ab1a8faeb946c9b7b51778d2238d3fd870e3dd4f WatchSource:0}: Error finding container 41b957c7c9d34b7b77d98cb2ab1a8faeb946c9b7b51778d2238d3fd870e3dd4f: Status 404 returned error can't find the container with id 41b957c7c9d34b7b77d98cb2ab1a8faeb946c9b7b51778d2238d3fd870e3dd4f Apr 22 19:58:17.236007 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:17.235985 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod603cfe2d_1c53_4bc8_acc0_b1d1751c2817.slice/crio-e11baf2ba2f767f8e94a393fc105a719f0c2d41c1490091a14b49805617a7e30 WatchSource:0}: Error finding container e11baf2ba2f767f8e94a393fc105a719f0c2d41c1490091a14b49805617a7e30: Status 404 returned error can't find the container with id e11baf2ba2f767f8e94a393fc105a719f0c2d41c1490091a14b49805617a7e30 Apr 22 19:58:17.236606 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:17.236585 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa7bef42_1723_430a_9e10_77a5e4e702fb.slice/crio-ff27416fc7846d4d9319a245aeec9323ee2b30f08426363e0c3cec164b18a8ed WatchSource:0}: Error finding container ff27416fc7846d4d9319a245aeec9323ee2b30f08426363e0c3cec164b18a8ed: Status 404 returned error can't find the 
container with id ff27416fc7846d4d9319a245aeec9323ee2b30f08426363e0c3cec164b18a8ed Apr 22 19:58:17.237836 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:17.237789 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58fec03b_a906_4fea_8253_f50ea8e5684b.slice/crio-0d7fcf5155c0c78d38e80e4f76d0a2be41bdb21a47e05ef7b69fe5470350ed5e WatchSource:0}: Error finding container 0d7fcf5155c0c78d38e80e4f76d0a2be41bdb21a47e05ef7b69fe5470350ed5e: Status 404 returned error can't find the container with id 0d7fcf5155c0c78d38e80e4f76d0a2be41bdb21a47e05ef7b69fe5470350ed5e Apr 22 19:58:17.238617 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:17.238574 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76458b53_1eb2_41d5_b4c3_01ca91b6f22d.slice/crio-36f6c54fcccb27d5b7f629c01ab91b828a32d6138c7850d8f3cea376dbf8b5a0 WatchSource:0}: Error finding container 36f6c54fcccb27d5b7f629c01ab91b828a32d6138c7850d8f3cea376dbf8b5a0: Status 404 returned error can't find the container with id 36f6c54fcccb27d5b7f629c01ab91b828a32d6138c7850d8f3cea376dbf8b5a0 Apr 22 19:58:17.240003 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:17.239980 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c76a5bd_34e5_4bbf_ae8c_4e89ca39908c.slice/crio-d56d732b3fc110093f561bdbcbe443f676a1089cc0fa6ac6db73e146556ffd21 WatchSource:0}: Error finding container d56d732b3fc110093f561bdbcbe443f676a1089cc0fa6ac6db73e146556ffd21: Status 404 returned error can't find the container with id d56d732b3fc110093f561bdbcbe443f676a1089cc0fa6ac6db73e146556ffd21 Apr 22 19:58:17.266242 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:17.266221 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xx55h\" (UniqueName: \"kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h\") pod \"network-check-target-64jvm\" (UID: \"ebb887c1-8ca2-4d72-9902-320466d0dce7\") " pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:17.266373 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:17.266355 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:17.266405 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:17.266382 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:17.266405 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:17.266396 2579 projected.go:194] Error preparing data for projected volume kube-api-access-xx55h for pod openshift-network-diagnostics/network-check-target-64jvm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:17.266506 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:17.266494 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h podName:ebb887c1-8ca2-4d72-9902-320466d0dce7 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:18.266451692 +0000 UTC m=+4.220814475 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xx55h" (UniqueName: "kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h") pod "network-check-target-64jvm" (UID: "ebb887c1-8ca2-4d72-9902-320466d0dce7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:17.581692 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:17.581598 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:53:15 +0000 UTC" deadline="2027-09-27 19:42:28.351430643 +0000 UTC" Apr 22 19:58:17.581692 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:17.581638 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12551h44m10.769796021s" Apr 22 19:58:17.700661 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:17.700622 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5dh6n" event={"ID":"76458b53-1eb2-41d5-b4c3-01ca91b6f22d","Type":"ContainerStarted","Data":"36f6c54fcccb27d5b7f629c01ab91b828a32d6138c7850d8f3cea376dbf8b5a0"} Apr 22 19:58:17.710837 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:17.710802 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" event={"ID":"58fec03b-a906-4fea-8253-f50ea8e5684b","Type":"ContainerStarted","Data":"0d7fcf5155c0c78d38e80e4f76d0a2be41bdb21a47e05ef7b69fe5470350ed5e"} Apr 22 19:58:17.713237 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:17.713209 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-86nst" event={"ID":"f5b56a04-ef22-4b41-8aa2-34438e2003fe","Type":"ContainerStarted","Data":"41b957c7c9d34b7b77d98cb2ab1a8faeb946c9b7b51778d2238d3fd870e3dd4f"} Apr 22 19:58:17.719497 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:17.719471 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vk6km" event={"ID":"603cfe2d-1c53-4bc8-acc0-b1d1751c2817","Type":"ContainerStarted","Data":"e11baf2ba2f767f8e94a393fc105a719f0c2d41c1490091a14b49805617a7e30"} Apr 22 19:58:17.725743 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:17.725696 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xzq74" event={"ID":"799125ad-367b-42aa-a2e1-222e89529bac","Type":"ContainerStarted","Data":"dfd24a584128da745843fb79c2b3c896fb31475a31771f3d3bbaf22844fc0133"} Apr 22 19:58:17.733378 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:17.733353 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qqmqh" event={"ID":"4ae3c039-dc7f-4fc6-923b-489950577bc6","Type":"ContainerStarted","Data":"6bef138be90aae4c73e8c0a7214f0e38dfd43007009d56e8a85680934127bea4"} Apr 22 19:58:17.741198 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:17.741174 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" event={"ID":"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c","Type":"ContainerStarted","Data":"d56d732b3fc110093f561bdbcbe443f676a1089cc0fa6ac6db73e146556ffd21"} Apr 22 19:58:17.755893 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:17.755846 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" 
event={"ID":"fa7bef42-1723-430a-9e10-77a5e4e702fb","Type":"ContainerStarted","Data":"ff27416fc7846d4d9319a245aeec9323ee2b30f08426363e0c3cec164b18a8ed"} Apr 22 19:58:17.771979 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:17.771952 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4jvwg" event={"ID":"f2e80311-0394-4386-8366-ef53a6861178","Type":"ContainerStarted","Data":"2b70a3c4cc455657a6deffad72d1c3d405ffeac73aa97a018c9069fe76f4bbb2"} Apr 22 19:58:17.784999 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:17.784504 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal" event={"ID":"6551c028e13aff466c38397d8a508ac4","Type":"ContainerStarted","Data":"5846fa10db54963970806ce13267a61a92c22e64da51f4579245168e5815c5c3"} Apr 22 19:58:17.800821 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:17.800775 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal" podStartSLOduration=1.800760001 podStartE2EDuration="1.800760001s" podCreationTimestamp="2026-04-22 19:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:17.799971336 +0000 UTC m=+3.754334139" watchObservedRunningTime="2026-04-22 19:58:17.800760001 +0000 UTC m=+3.755122811" Apr 22 19:58:18.177635 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:18.175546 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret\") pod \"global-pull-secret-syncer-vlm4l\" (UID: \"d0402063-8f80-4f0b-8247-b3bd2ae51e51\") " pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:18.177635 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:18.175597 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs\") pod \"network-metrics-daemon-p2gjc\" (UID: \"56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728\") " pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:18.177635 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:18.175825 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:18.177635 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:18.175851 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:18.177635 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:18.175898 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret podName:d0402063-8f80-4f0b-8247-b3bd2ae51e51 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:20.175879144 +0000 UTC m=+6.130241930 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret") pod "global-pull-secret-syncer-vlm4l" (UID: "d0402063-8f80-4f0b-8247-b3bd2ae51e51") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:18.177635 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:18.175922 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs podName:56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:20.175911453 +0000 UTC m=+6.130274241 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs") pod "network-metrics-daemon-p2gjc" (UID: "56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:18.276687 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:18.276654 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xx55h\" (UniqueName: \"kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h\") pod \"network-check-target-64jvm\" (UID: \"ebb887c1-8ca2-4d72-9902-320466d0dce7\") " pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:18.277097 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:18.276805 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:18.277097 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:18.276826 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:18.277097 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:18.276838 2579 projected.go:194] Error preparing data for projected volume kube-api-access-xx55h for pod openshift-network-diagnostics/network-check-target-64jvm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:18.277097 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:18.276888 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h podName:ebb887c1-8ca2-4d72-9902-320466d0dce7 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:20.276870162 +0000 UTC m=+6.231232948 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-xx55h" (UniqueName: "kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h") pod "network-check-target-64jvm" (UID: "ebb887c1-8ca2-4d72-9902-320466d0dce7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:18.688875 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:18.688845 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:18.689382 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:18.688845 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:18.689382 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:18.688983 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2gjc" podUID="56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728" Apr 22 19:58:18.689382 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:18.689083 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vlm4l" podUID="d0402063-8f80-4f0b-8247-b3bd2ae51e51" Apr 22 19:58:18.689382 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:18.689128 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:18.689382 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:18.689209 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64jvm" podUID="ebb887c1-8ca2-4d72-9902-320466d0dce7" Apr 22 19:58:18.805041 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:18.805005 2579 generic.go:358] "Generic (PLEG): container finished" podID="ea9c91cb8023c6edf0f87546014505a5" containerID="1bdcd3f6a726742f2262e1f70be9675a47d2f8734d7b77b818338f7fa2227f21" exitCode=0 Apr 22 19:58:18.805226 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:18.805126 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" event={"ID":"ea9c91cb8023c6edf0f87546014505a5","Type":"ContainerDied","Data":"1bdcd3f6a726742f2262e1f70be9675a47d2f8734d7b77b818338f7fa2227f21"} Apr 22 19:58:19.812671 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:19.812628 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" event={"ID":"ea9c91cb8023c6edf0f87546014505a5","Type":"ContainerStarted","Data":"396c8ff2326f13f1567a27c28f852f85062394b6d8c792cc32bdfb3e280f6e74"} Apr 22 19:58:20.192273 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:20.192233 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret\") pod \"global-pull-secret-syncer-vlm4l\" (UID: \"d0402063-8f80-4f0b-8247-b3bd2ae51e51\") " pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:20.192442 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:20.192296 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs\") pod \"network-metrics-daemon-p2gjc\" (UID: \"56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728\") " 
pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:20.192511 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:20.192447 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:20.192567 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:20.192534 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs podName:56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:24.192515051 +0000 UTC m=+10.146877850 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs") pod "network-metrics-daemon-p2gjc" (UID: "56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:20.192932 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:20.192914 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:20.193001 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:20.192967 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret podName:d0402063-8f80-4f0b-8247-b3bd2ae51e51 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:24.192953778 +0000 UTC m=+10.147316563 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret") pod "global-pull-secret-syncer-vlm4l" (UID: "d0402063-8f80-4f0b-8247-b3bd2ae51e51") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:20.293545 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:20.293505 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xx55h\" (UniqueName: \"kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h\") pod \"network-check-target-64jvm\" (UID: \"ebb887c1-8ca2-4d72-9902-320466d0dce7\") " pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:20.293736 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:20.293668 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:20.293736 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:20.293694 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:20.293736 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:20.293707 2579 projected.go:194] Error preparing data for projected volume kube-api-access-xx55h for pod openshift-network-diagnostics/network-check-target-64jvm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:20.293901 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:20.293765 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h podName:ebb887c1-8ca2-4d72-9902-320466d0dce7 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:24.293745717 +0000 UTC m=+10.248108498 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xx55h" (UniqueName: "kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h") pod "network-check-target-64jvm" (UID: "ebb887c1-8ca2-4d72-9902-320466d0dce7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:20.691164 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:20.689301 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:20.691164 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:20.689435 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2gjc" podUID="56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728" Apr 22 19:58:20.691164 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:20.689846 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:20.691164 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:20.689917 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64jvm" podUID="ebb887c1-8ca2-4d72-9902-320466d0dce7" Apr 22 19:58:20.691164 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:20.689982 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:20.691164 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:20.690025 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vlm4l" podUID="d0402063-8f80-4f0b-8247-b3bd2ae51e51" Apr 22 19:58:22.688994 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:22.688952 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:22.689453 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:22.689086 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64jvm" podUID="ebb887c1-8ca2-4d72-9902-320466d0dce7" Apr 22 19:58:22.689453 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:22.689176 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:22.689453 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:22.689247 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vlm4l" podUID="d0402063-8f80-4f0b-8247-b3bd2ae51e51" Apr 22 19:58:22.689453 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:22.689309 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:22.689453 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:22.689388 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2gjc" podUID="56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728" Apr 22 19:58:24.222439 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:24.222399 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret\") pod \"global-pull-secret-syncer-vlm4l\" (UID: \"d0402063-8f80-4f0b-8247-b3bd2ae51e51\") " pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:24.222908 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:24.222452 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs\") pod \"network-metrics-daemon-p2gjc\" (UID: \"56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728\") " pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:24.222908 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:24.222541 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:24.222908 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:24.222584 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:24.222908 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:24.222601 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret podName:d0402063-8f80-4f0b-8247-b3bd2ae51e51 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:32.222587011 +0000 UTC m=+18.176949789 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret") pod "global-pull-secret-syncer-vlm4l" (UID: "d0402063-8f80-4f0b-8247-b3bd2ae51e51") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:24.222908 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:24.222638 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs podName:56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:32.222623035 +0000 UTC m=+18.176985830 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs") pod "network-metrics-daemon-p2gjc" (UID: "56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:24.323453 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:24.323415 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xx55h\" (UniqueName: \"kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h\") pod \"network-check-target-64jvm\" (UID: \"ebb887c1-8ca2-4d72-9902-320466d0dce7\") " pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:24.323633 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:24.323617 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:24.323693 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:24.323641 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:24.323693 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:24.323654 2579 projected.go:194] Error preparing data for projected volume kube-api-access-xx55h for pod openshift-network-diagnostics/network-check-target-64jvm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:24.323823 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:24.323715 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h podName:ebb887c1-8ca2-4d72-9902-320466d0dce7 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:32.323696336 +0000 UTC m=+18.278059118 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-xx55h" (UniqueName: "kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h") pod "network-check-target-64jvm" (UID: "ebb887c1-8ca2-4d72-9902-320466d0dce7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:24.691888 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:24.691844 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:24.691888 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:24.691881 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:24.692161 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:24.691974 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-vlm4l" podUID="d0402063-8f80-4f0b-8247-b3bd2ae51e51" Apr 22 19:58:24.692161 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:24.692053 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64jvm" podUID="ebb887c1-8ca2-4d72-9902-320466d0dce7" Apr 22 19:58:24.692161 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:24.692124 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:24.692414 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:24.692233 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2gjc" podUID="56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728" Apr 22 19:58:26.689417 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:26.689331 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:26.689417 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:26.689356 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:26.689898 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:26.689347 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:26.689898 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:26.689459 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vlm4l" podUID="d0402063-8f80-4f0b-8247-b3bd2ae51e51" Apr 22 19:58:26.689898 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:26.689562 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64jvm" podUID="ebb887c1-8ca2-4d72-9902-320466d0dce7" Apr 22 19:58:26.689898 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:26.689675 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2gjc" podUID="56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728" Apr 22 19:58:28.691643 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:28.691615 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:28.691643 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:28.691645 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:28.691643 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:28.691659 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:28.692126 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:28.691749 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64jvm" podUID="ebb887c1-8ca2-4d72-9902-320466d0dce7" Apr 22 19:58:28.692126 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:28.691888 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vlm4l" podUID="d0402063-8f80-4f0b-8247-b3bd2ae51e51" Apr 22 19:58:28.692126 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:28.691985 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2gjc" podUID="56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728" Apr 22 19:58:30.689666 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:30.689244 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:30.689666 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:30.689366 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:30.689666 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:30.689364 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2gjc" podUID="56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728" Apr 22 19:58:30.689666 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:30.689464 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64jvm" podUID="ebb887c1-8ca2-4d72-9902-320466d0dce7" Apr 22 19:58:30.689666 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:30.689501 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:30.689666 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:30.689613 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vlm4l" podUID="d0402063-8f80-4f0b-8247-b3bd2ae51e51" Apr 22 19:58:32.277528 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:32.277490 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret\") pod \"global-pull-secret-syncer-vlm4l\" (UID: \"d0402063-8f80-4f0b-8247-b3bd2ae51e51\") " pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:32.277528 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:32.277534 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs\") pod \"network-metrics-daemon-p2gjc\" (UID: \"56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728\") " pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:32.278075 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:32.277611 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:32.278075 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:32.277644 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:32.278075 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:32.277704 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret podName:d0402063-8f80-4f0b-8247-b3bd2ae51e51 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:48.277670427 +0000 UTC m=+34.232033211 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret") pod "global-pull-secret-syncer-vlm4l" (UID: "d0402063-8f80-4f0b-8247-b3bd2ae51e51") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:32.278075 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:32.277725 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs podName:56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:48.277715046 +0000 UTC m=+34.232077825 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs") pod "network-metrics-daemon-p2gjc" (UID: "56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:32.378815 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:32.378789 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xx55h\" (UniqueName: \"kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h\") pod \"network-check-target-64jvm\" (UID: \"ebb887c1-8ca2-4d72-9902-320466d0dce7\") " pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:32.378977 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:32.378917 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:32.378977 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:32.378937 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:32.378977 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:32.378948 2579 projected.go:194] Error preparing data for projected volume kube-api-access-xx55h for pod openshift-network-diagnostics/network-check-target-64jvm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:32.379084 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:32.379007 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h podName:ebb887c1-8ca2-4d72-9902-320466d0dce7 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:48.378989384 +0000 UTC m=+34.333352163 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-xx55h" (UniqueName: "kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h") pod "network-check-target-64jvm" (UID: "ebb887c1-8ca2-4d72-9902-320466d0dce7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:32.691889 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:32.691860 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:32.691889 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:32.691875 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:32.692103 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:32.691864 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:32.692103 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:32.691965 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-64jvm" podUID="ebb887c1-8ca2-4d72-9902-320466d0dce7" Apr 22 19:58:32.692103 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:32.692054 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vlm4l" podUID="d0402063-8f80-4f0b-8247-b3bd2ae51e51" Apr 22 19:58:32.692254 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:32.692156 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2gjc" podUID="56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728" Apr 22 19:58:34.693337 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:34.693092 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:34.694084 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:34.693196 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:34.694084 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:34.693444 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vlm4l" podUID="d0402063-8f80-4f0b-8247-b3bd2ae51e51" Apr 22 19:58:34.694084 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:34.693527 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64jvm" podUID="ebb887c1-8ca2-4d72-9902-320466d0dce7" Apr 22 19:58:34.694084 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:34.693208 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:34.694084 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:34.693637 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p2gjc" podUID="56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728" Apr 22 19:58:34.836545 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:34.836518 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" event={"ID":"fa7bef42-1723-430a-9e10-77a5e4e702fb","Type":"ContainerStarted","Data":"2983202e37e2c7d2ec8b72f150aac42a1e0f5586d5dda0208ab680cac6d8449f"} Apr 22 19:58:34.837717 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:34.837681 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4jvwg" event={"ID":"f2e80311-0394-4386-8366-ef53a6861178","Type":"ContainerStarted","Data":"61d955fa7ed398b151a7224f47500cf56d3457bb050b1a833b31127622e7353d"} Apr 22 19:58:34.838922 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:34.838904 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5dh6n" event={"ID":"76458b53-1eb2-41d5-b4c3-01ca91b6f22d","Type":"ContainerStarted","Data":"7eb29649c5a8045403e69cffd481a349627f1356e982f9b9779137852b2988f8"} Apr 22 19:58:34.840086 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:34.840066 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" event={"ID":"58fec03b-a906-4fea-8253-f50ea8e5684b","Type":"ContainerStarted","Data":"4ec66722b45bff6a54e8cbf56411662118643a83c7f355d8be0e38fd025ec2f8"} Apr 22 19:58:34.841391 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:34.841370 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-86nst" event={"ID":"f5b56a04-ef22-4b41-8aa2-34438e2003fe","Type":"ContainerStarted","Data":"0172990ad67ea115b0fff948d28f6224baf63625e04aac231af450a260784ee5"} Apr 22 19:58:34.842703 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:34.842680 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vk6km" event={"ID":"603cfe2d-1c53-4bc8-acc0-b1d1751c2817","Type":"ContainerStarted","Data":"d65fc3f2c9efee1c6b165a26b1c0dffee7d658eb254e79cf02172b11edf5a48f"} Apr 22 19:58:34.843764 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:34.843745 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xzq74" event={"ID":"799125ad-367b-42aa-a2e1-222e89529bac","Type":"ContainerStarted","Data":"cd41c7adf58a9c034ad966801a490ac851e4fb217554aca47176ceef751a7519"} Apr 22 19:58:34.859443 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:34.859409 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" podStartSLOduration=18.859385065 podStartE2EDuration="18.859385065s" podCreationTimestamp="2026-04-22 19:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:19.827994318 +0000 UTC m=+5.782357119" watchObservedRunningTime="2026-04-22 19:58:34.859385065 +0000 UTC m=+20.813747844" Apr 22 19:58:34.860006 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:34.859984 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-lt5hj" podStartSLOduration=3.690232061 podStartE2EDuration="20.859977007s" podCreationTimestamp="2026-04-22 19:58:14 +0000 UTC" firstStartedPulling="2026-04-22 19:58:17.239661298 +0000 UTC m=+3.194024079" lastFinishedPulling="2026-04-22 19:58:34.409406232 +0000 
UTC m=+20.363769025" observedRunningTime="2026-04-22 19:58:34.859489891 +0000 UTC m=+20.813852692" watchObservedRunningTime="2026-04-22 19:58:34.859977007 +0000 UTC m=+20.814339838" Apr 22 19:58:34.899947 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:34.899907 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-vk6km" podStartSLOduration=3.728853725 podStartE2EDuration="20.899896735s" podCreationTimestamp="2026-04-22 19:58:14 +0000 UTC" firstStartedPulling="2026-04-22 19:58:17.23803116 +0000 UTC m=+3.192393939" lastFinishedPulling="2026-04-22 19:58:34.409074156 +0000 UTC m=+20.363436949" observedRunningTime="2026-04-22 19:58:34.898877586 +0000 UTC m=+20.853240388" watchObservedRunningTime="2026-04-22 19:58:34.899896735 +0000 UTC m=+20.854259535" Apr 22 19:58:34.926206 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:34.926126 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xzq74" podStartSLOduration=2.752760943 podStartE2EDuration="19.926107973s" podCreationTimestamp="2026-04-22 19:58:15 +0000 UTC" firstStartedPulling="2026-04-22 19:58:17.235720481 +0000 UTC m=+3.190083259" lastFinishedPulling="2026-04-22 19:58:34.409067492 +0000 UTC m=+20.363430289" observedRunningTime="2026-04-22 19:58:34.922890268 +0000 UTC m=+20.877253070" watchObservedRunningTime="2026-04-22 19:58:34.926107973 +0000 UTC m=+20.880470778" Apr 22 19:58:34.949477 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:34.949311 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-4jvwg" podStartSLOduration=3.7706343589999998 podStartE2EDuration="20.94929762s" podCreationTimestamp="2026-04-22 19:58:14 +0000 UTC" firstStartedPulling="2026-04-22 19:58:17.23040767 +0000 UTC m=+3.184770449" lastFinishedPulling="2026-04-22 19:58:34.40907093 +0000 UTC m=+20.363433710" observedRunningTime="2026-04-22 19:58:34.949098375 +0000 UTC m=+20.903461177" watchObservedRunningTime="2026-04-22 19:58:34.94929762 +0000 UTC m=+20.903660421" Apr 22 19:58:34.971213 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:34.971112 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5dh6n" podStartSLOduration=3.762103683 podStartE2EDuration="20.971096954s" podCreationTimestamp="2026-04-22 19:58:14 +0000 UTC" firstStartedPulling="2026-04-22 19:58:17.241198935 +0000 UTC m=+3.195561728" lastFinishedPulling="2026-04-22 19:58:34.450192218 +0000 UTC m=+20.404554999" observedRunningTime="2026-04-22 19:58:34.97065494 +0000 UTC m=+20.925017735" watchObservedRunningTime="2026-04-22 19:58:34.971096954 +0000 UTC m=+20.925459752" Apr 22 19:58:35.644122 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:35.644097 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 19:58:35.847790 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:35.847567 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" event={"ID":"58fec03b-a906-4fea-8253-f50ea8e5684b","Type":"ContainerStarted","Data":"4add86344f5e00d78ab6a01026acb2f83ce9e166a1c2706528eb20397ec2e008"} Apr 22 19:58:35.848746 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:35.848726 2579 generic.go:358] "Generic (PLEG): container finished" podID="f5b56a04-ef22-4b41-8aa2-34438e2003fe" 
containerID="0172990ad67ea115b0fff948d28f6224baf63625e04aac231af450a260784ee5" exitCode=0 Apr 22 19:58:35.848856 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:35.848754 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-86nst" event={"ID":"f5b56a04-ef22-4b41-8aa2-34438e2003fe","Type":"ContainerDied","Data":"0172990ad67ea115b0fff948d28f6224baf63625e04aac231af450a260784ee5"} Apr 22 19:58:35.850248 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:35.850165 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qqmqh" event={"ID":"4ae3c039-dc7f-4fc6-923b-489950577bc6","Type":"ContainerStarted","Data":"83366f0f5374e55a16d431cc547efc05051a2d5542b20c500104468aad9d00fc"} Apr 22 19:58:35.853059 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:35.853014 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" event={"ID":"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c","Type":"ContainerStarted","Data":"c99e00d0ad1c6b3352fa8f4c8ae5b5525c5dfad093be603e6384ed55b2e8e640"} Apr 22 19:58:35.853059 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:35.853047 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" event={"ID":"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c","Type":"ContainerStarted","Data":"4e5e3829f9dc284175f9d39a4733611f8d709f92fe376f7277ee7247720c7769"} Apr 22 19:58:35.853278 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:35.853064 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" event={"ID":"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c","Type":"ContainerStarted","Data":"6259821f65a67cff67f2d963d8b427db060f4611abc9ae42e9463ddac44a999c"} Apr 22 19:58:35.853278 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:35.853075 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" event={"ID":"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c","Type":"ContainerStarted","Data":"e5416819e602ca3b8f049da386d38506d51fc7a9b4980ee24b1a498b17d50351"} Apr 22 19:58:35.853278 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:35.853086 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" event={"ID":"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c","Type":"ContainerStarted","Data":"a96da617b444517b9b674843ee8eb40e7de4208304f58b7849a1714eb376fa1d"} Apr 22 19:58:35.853278 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:35.853097 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" event={"ID":"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c","Type":"ContainerStarted","Data":"a3d0f9dd29e91d1ebc775fc4239518de66a6808ad8871a1750b4e4483b136797"} Apr 22 19:58:36.638058 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:36.637899 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:58:35.644118428Z","UUID":"f932cf51-e042-4236-b4a4-39af070d3492","Handler":null,"Name":"","Endpoint":""} Apr 22 19:58:36.639666 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:36.639645 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 19:58:36.639789 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:36.639674 2579 csi_plugin.go:119] kubernetes.io/csi: 
Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 19:58:36.688670 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:36.688639 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:36.688809 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:36.688771 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:36.688809 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:36.688784 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:36.688928 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:36.688773 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vlm4l" podUID="d0402063-8f80-4f0b-8247-b3bd2ae51e51" Apr 22 19:58:36.688928 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:36.688882 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64jvm" podUID="ebb887c1-8ca2-4d72-9902-320466d0dce7" Apr 22 19:58:36.689031 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:36.688969 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p2gjc" podUID="56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728" Apr 22 19:58:36.856577 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:36.856541 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" event={"ID":"58fec03b-a906-4fea-8253-f50ea8e5684b","Type":"ContainerStarted","Data":"7d18d3fed6e1f1e9eaeb8231861472c05caaafcad30a32aabd701c1e34934849"} Apr 22 19:58:36.873924 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:36.873883 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7fcnh" podStartSLOduration=3.744934635 podStartE2EDuration="22.873869854s" podCreationTimestamp="2026-04-22 19:58:14 +0000 UTC" firstStartedPulling="2026-04-22 19:58:17.240289235 +0000 UTC m=+3.194652014" lastFinishedPulling="2026-04-22 19:58:36.36922445 +0000 UTC m=+22.323587233" observedRunningTime="2026-04-22 19:58:36.873795881 +0000 UTC m=+22.828158682" watchObservedRunningTime="2026-04-22 19:58:36.873869854 +0000 UTC m=+22.828232655" Apr 22 19:58:36.874131 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:36.874100 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qqmqh" podStartSLOduration=5.69842598 podStartE2EDuration="22.874094744s" podCreationTimestamp="2026-04-22 19:58:14 +0000 UTC" firstStartedPulling="2026-04-22 19:58:17.233368694 +0000 UTC m=+3.187731472" lastFinishedPulling="2026-04-22 19:58:34.409037451 +0000 UTC m=+20.363400236" observedRunningTime="2026-04-22 19:58:35.886366247 +0000 UTC m=+21.840729048" watchObservedRunningTime="2026-04-22 19:58:36.874094744 +0000 UTC m=+22.828457579" Apr 22 19:58:37.796236 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:37.796203 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-4jvwg" Apr 22 19:58:37.796855 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:37.796831 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-4jvwg" Apr 22 19:58:37.861792 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:37.861765 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" event={"ID":"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c","Type":"ContainerStarted","Data":"115b60428fe6abff20193f42aeb20ad57b0e1b21f96b81d63947a137ee4e1989"} Apr 22 19:58:38.689285 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:38.689259 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:38.689465 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:38.689259 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:38.689465 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:38.689386 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2gjc" podUID="56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728" Apr 22 19:58:38.689465 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:38.689258 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:38.689465 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:38.689454 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vlm4l" podUID="d0402063-8f80-4f0b-8247-b3bd2ae51e51" Apr 22 19:58:38.689667 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:38.689568 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64jvm" podUID="ebb887c1-8ca2-4d72-9902-320466d0dce7" Apr 22 19:58:39.872448 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:39.871164 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" event={"ID":"8c76a5bd-34e5-4bbf-ae8c-4e89ca39908c","Type":"ContainerStarted","Data":"3dfc6c3fd32b1077e07fe58400a9581f57bdf27a2da1164f26b6a62f76aace11"} Apr 22 19:58:39.872448 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:39.871482 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:39.872448 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:39.871506 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:39.872448 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:39.871520 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:39.890237 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:39.890203 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:39.892179 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:39.892033 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:58:39.901368 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:39.901229 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" podStartSLOduration=7.2351434 podStartE2EDuration="24.901211842s" podCreationTimestamp="2026-04-22 19:58:15 +0000 UTC" firstStartedPulling="2026-04-22 19:58:17.24146287 +0000 UTC m=+3.195825655" lastFinishedPulling="2026-04-22 19:58:34.907531309 +0000 UTC m=+20.861894097" observedRunningTime="2026-04-22 19:58:39.900170203 +0000 UTC m=+25.854532997" watchObservedRunningTime="2026-04-22 19:58:39.901211842 +0000 UTC m=+25.855574644" Apr 22 19:58:40.689204 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:40.689169 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:40.689204 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:40.689184 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:40.689204 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:40.689205 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:40.689420 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:40.689272 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2gjc" podUID="56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728" Apr 22 19:58:40.689420 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:40.689338 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64jvm" podUID="ebb887c1-8ca2-4d72-9902-320466d0dce7" Apr 22 19:58:40.689420 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:40.689380 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vlm4l" podUID="d0402063-8f80-4f0b-8247-b3bd2ae51e51" Apr 22 19:58:40.873936 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:40.873906 2579 generic.go:358] "Generic (PLEG): container finished" podID="f5b56a04-ef22-4b41-8aa2-34438e2003fe" containerID="324f3c8ac4f1cc43ae1a079984547a334fe6f556310fa218a8063d715d7648fa" exitCode=0 Apr 22 19:58:40.874478 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:40.874000 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-86nst" event={"ID":"f5b56a04-ef22-4b41-8aa2-34438e2003fe","Type":"ContainerDied","Data":"324f3c8ac4f1cc43ae1a079984547a334fe6f556310fa218a8063d715d7648fa"} Apr 22 19:58:41.723897 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:41.723871 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-64jvm"] Apr 22 19:58:41.724000 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:41.723959 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:41.724051 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:41.724033 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-64jvm" podUID="ebb887c1-8ca2-4d72-9902-320466d0dce7" Apr 22 19:58:41.727127 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:41.727107 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vlm4l"] Apr 22 19:58:41.727241 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:41.727201 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:41.727298 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:41.727281 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vlm4l" podUID="d0402063-8f80-4f0b-8247-b3bd2ae51e51" Apr 22 19:58:41.727603 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:41.727580 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-p2gjc"] Apr 22 19:58:41.727700 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:41.727663 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:41.727787 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:41.727764 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2gjc" podUID="56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728" Apr 22 19:58:41.877897 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:41.877874 2579 generic.go:358] "Generic (PLEG): container finished" podID="f5b56a04-ef22-4b41-8aa2-34438e2003fe" containerID="5e1304222c6616837ded74b0256fc22f5ca746a5af21f570c7c1b09716ee3d52" exitCode=0 Apr 22 19:58:41.878273 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:41.877959 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-86nst" event={"ID":"f5b56a04-ef22-4b41-8aa2-34438e2003fe","Type":"ContainerDied","Data":"5e1304222c6616837ded74b0256fc22f5ca746a5af21f570c7c1b09716ee3d52"} Apr 22 19:58:42.647669 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:42.647493 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-4jvwg" Apr 22 19:58:42.647787 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:42.647775 2579 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 19:58:42.648097 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:42.648078 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-4jvwg" Apr 22 19:58:42.881582 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:42.881547 2579 generic.go:358] "Generic (PLEG): container finished" podID="f5b56a04-ef22-4b41-8aa2-34438e2003fe" containerID="76cfbe9fbe848e02d748125288bcd07de92e269fcea2a67256b50cc924a0852a" exitCode=0 Apr 22 19:58:42.881925 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:42.881629 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-86nst" 
event={"ID":"f5b56a04-ef22-4b41-8aa2-34438e2003fe","Type":"ContainerDied","Data":"76cfbe9fbe848e02d748125288bcd07de92e269fcea2a67256b50cc924a0852a"} Apr 22 19:58:43.688691 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:43.688654 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:43.688691 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:43.688686 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:43.688904 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:43.688776 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64jvm" podUID="ebb887c1-8ca2-4d72-9902-320466d0dce7" Apr 22 19:58:43.688904 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:43.688830 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:43.688992 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:43.688923 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2gjc" podUID="56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728" Apr 22 19:58:43.689038 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:43.689003 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vlm4l" podUID="d0402063-8f80-4f0b-8247-b3bd2ae51e51" Apr 22 19:58:45.688622 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:45.688582 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:45.689209 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:45.688693 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64jvm" podUID="ebb887c1-8ca2-4d72-9902-320466d0dce7" Apr 22 19:58:45.689209 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:45.688706 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:45.689209 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:45.688737 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:45.689209 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:45.688863 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vlm4l" podUID="d0402063-8f80-4f0b-8247-b3bd2ae51e51" Apr 22 19:58:45.689209 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:45.688946 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2gjc" podUID="56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728" Apr 22 19:58:47.367382 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.367355 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeReady" Apr 22 19:58:47.367942 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.367498 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 19:58:47.403284 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.403253 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-dffdb89c7-6k4v2"] Apr 22 19:58:47.428210 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.428179 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-dffdb89c7-6k4v2"] Apr 22 19:58:47.428210 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.428209 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4sbb4"] Apr 22 19:58:47.428414 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.428349 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.434250 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.431711 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 19:58:47.434250 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.432277 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 19:58:47.434250 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.432324 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 19:58:47.434250 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.432652 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-v9fvc\"" Apr 22 19:58:47.444536 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.444470 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 19:58:47.447878 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.447857 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wm9mw"] Apr 22 19:58:47.448033 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.448015 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4sbb4" Apr 22 19:58:47.450323 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.450304 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 19:58:47.450420 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.450321 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 19:58:47.450484 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.450467 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2xtr6\"" Apr 22 19:58:47.465112 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.465094 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4sbb4"] Apr 22 19:58:47.465220 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.465120 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wm9mw"] Apr 22 19:58:47.465264 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.465240 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wm9mw" Apr 22 19:58:47.467133 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.467115 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 19:58:47.467250 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.467132 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 19:58:47.467546 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.467400 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-g9zd7\"" Apr 22 19:58:47.467546 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.467437 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 19:58:47.598154 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.598100 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t87n\" (UniqueName: \"kubernetes.io/projected/ea808e89-1697-4235-8c42-8202cc97fef9-kube-api-access-8t87n\") pod \"ingress-canary-wm9mw\" (UID: \"ea808e89-1697-4235-8c42-8202cc97fef9\") " pod="openshift-ingress-canary/ingress-canary-wm9mw" Apr 22 19:58:47.598300 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.598168 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7c89560e-578e-4ca8-bb28-9608c190c546-tmp-dir\") pod \"dns-default-4sbb4\" (UID: \"7c89560e-578e-4ca8-bb28-9608c190c546\") " pod="openshift-dns/dns-default-4sbb4" Apr 22 19:58:47.598300 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.598198 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert\") pod \"ingress-canary-wm9mw\" (UID: \"ea808e89-1697-4235-8c42-8202cc97fef9\") " pod="openshift-ingress-canary/ingress-canary-wm9mw" Apr 22 19:58:47.598300 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.598222 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls\") pod \"dns-default-4sbb4\" (UID: \"7c89560e-578e-4ca8-bb28-9608c190c546\") " pod="openshift-dns/dns-default-4sbb4" Apr 22 19:58:47.598300 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.598263 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.598468 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.598324 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5f7a6a88-1728-49a9-a02a-3f66d4a19934-ca-trust-extracted\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.598468 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.598351 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrrl6\" (UniqueName: \"kubernetes.io/projected/7c89560e-578e-4ca8-bb28-9608c190c546-kube-api-access-hrrl6\") pod \"dns-default-4sbb4\" (UID: \"7c89560e-578e-4ca8-bb28-9608c190c546\") " pod="openshift-dns/dns-default-4sbb4" Apr 22 19:58:47.598468 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.598380 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f7a6a88-1728-49a9-a02a-3f66d4a19934-trusted-ca\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.598468 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.598409 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5f7a6a88-1728-49a9-a02a-3f66d4a19934-installation-pull-secrets\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.598468 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.598437 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5f7a6a88-1728-49a9-a02a-3f66d4a19934-image-registry-private-configuration\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.598468 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.598463 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-certificates\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.598679 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.598487 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nx6s\" (UniqueName: \"kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-kube-api-access-5nx6s\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.598679 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.598513 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c89560e-578e-4ca8-bb28-9608c190c546-config-volume\") pod \"dns-default-4sbb4\" (UID: \"7c89560e-578e-4ca8-bb28-9608c190c546\") " pod="openshift-dns/dns-default-4sbb4" Apr 22 19:58:47.598679 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.598567 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-bound-sa-token\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.688517 ip-10-0-141-46 kubenswrapper[2579]: 
I0422 19:58:47.688480 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:47.688517 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.688516 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:47.688734 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.688600 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:47.691157 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.691118 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:58:47.691286 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.691248 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:58:47.691286 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.691277 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:58:47.691396 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.691379 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vnzkf\"" Apr 22 19:58:47.691455 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.691392 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-h4q6q\"" Apr 22 19:58:47.691505 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.691484 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 19:58:47.698945 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.698924 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f7a6a88-1728-49a9-a02a-3f66d4a19934-trusted-ca\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.699072 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.698963 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5f7a6a88-1728-49a9-a02a-3f66d4a19934-installation-pull-secrets\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.699072 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.698991 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5f7a6a88-1728-49a9-a02a-3f66d4a19934-image-registry-private-configuration\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.699195 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.699126 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-certificates\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: 
\"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.699239 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.699201 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nx6s\" (UniqueName: \"kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-kube-api-access-5nx6s\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.699294 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.699253 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c89560e-578e-4ca8-bb28-9608c190c546-config-volume\") pod \"dns-default-4sbb4\" (UID: \"7c89560e-578e-4ca8-bb28-9608c190c546\") " pod="openshift-dns/dns-default-4sbb4" Apr 22 19:58:47.699344 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.699289 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-bound-sa-token\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.699344 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.699333 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8t87n\" (UniqueName: \"kubernetes.io/projected/ea808e89-1697-4235-8c42-8202cc97fef9-kube-api-access-8t87n\") pod \"ingress-canary-wm9mw\" (UID: \"ea808e89-1697-4235-8c42-8202cc97fef9\") " pod="openshift-ingress-canary/ingress-canary-wm9mw" Apr 22 19:58:47.699425 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.699383 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7c89560e-578e-4ca8-bb28-9608c190c546-tmp-dir\") pod \"dns-default-4sbb4\" (UID: \"7c89560e-578e-4ca8-bb28-9608c190c546\") " pod="openshift-dns/dns-default-4sbb4" Apr 22 19:58:47.699469 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.699422 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert\") pod \"ingress-canary-wm9mw\" (UID: \"ea808e89-1697-4235-8c42-8202cc97fef9\") " pod="openshift-ingress-canary/ingress-canary-wm9mw" Apr 22 19:58:47.699469 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.699454 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls\") pod \"dns-default-4sbb4\" (UID: \"7c89560e-578e-4ca8-bb28-9608c190c546\") " pod="openshift-dns/dns-default-4sbb4" Apr 22 19:58:47.699555 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.699505 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.699607 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.699577 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/5f7a6a88-1728-49a9-a02a-3f66d4a19934-ca-trust-extracted\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.699656 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.699602 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrrl6\" (UniqueName: \"kubernetes.io/projected/7c89560e-578e-4ca8-bb28-9608c190c546-kube-api-access-hrrl6\") pod \"dns-default-4sbb4\" (UID: \"7c89560e-578e-4ca8-bb28-9608c190c546\") " pod="openshift-dns/dns-default-4sbb4" Apr 22 19:58:47.700439 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.700416 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f7a6a88-1728-49a9-a02a-3f66d4a19934-trusted-ca\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.700551 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.700518 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-certificates\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.700919 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.700901 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c89560e-578e-4ca8-bb28-9608c190c546-config-volume\") pod \"dns-default-4sbb4\" (UID: \"7c89560e-578e-4ca8-bb28-9608c190c546\") " pod="openshift-dns/dns-default-4sbb4" Apr 22 19:58:47.701083 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:47.701067 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:47.701159 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:47.701122 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert podName:ea808e89-1697-4235-8c42-8202cc97fef9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:48.201105451 +0000 UTC m=+34.155468245 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert") pod "ingress-canary-wm9mw" (UID: "ea808e89-1697-4235-8c42-8202cc97fef9") : secret "canary-serving-cert" not found Apr 22 19:58:47.701223 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:47.701208 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:47.701275 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:47.701264 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls podName:7c89560e-578e-4ca8-bb28-9608c190c546 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:48.201249072 +0000 UTC m=+34.155611852 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls") pod "dns-default-4sbb4" (UID: "7c89560e-578e-4ca8-bb28-9608c190c546") : secret "dns-default-metrics-tls" not found Apr 22 19:58:47.701341 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:47.701331 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:47.701377 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:47.701343 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dffdb89c7-6k4v2: secret "image-registry-tls" not found Apr 22 19:58:47.701413 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:47.701376 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls podName:5f7a6a88-1728-49a9-a02a-3f66d4a19934 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:48.201364666 +0000 UTC m=+34.155727448 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls") pod "image-registry-dffdb89c7-6k4v2" (UID: "5f7a6a88-1728-49a9-a02a-3f66d4a19934") : secret "image-registry-tls" not found Apr 22 19:58:47.701668 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.701649 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5f7a6a88-1728-49a9-a02a-3f66d4a19934-ca-trust-extracted\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.701724 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.701708 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7c89560e-578e-4ca8-bb28-9608c190c546-tmp-dir\") pod \"dns-default-4sbb4\" (UID: \"7c89560e-578e-4ca8-bb28-9608c190c546\") " pod="openshift-dns/dns-default-4sbb4" Apr 22 19:58:47.705006 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.704403 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5f7a6a88-1728-49a9-a02a-3f66d4a19934-installation-pull-secrets\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.705006 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.704448 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5f7a6a88-1728-49a9-a02a-3f66d4a19934-image-registry-private-configuration\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.711942 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.711892 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nx6s\" (UniqueName: \"kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-kube-api-access-5nx6s\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.712444 ip-10-0-141-46 
kubenswrapper[2579]: I0422 19:58:47.712420 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t87n\" (UniqueName: \"kubernetes.io/projected/ea808e89-1697-4235-8c42-8202cc97fef9-kube-api-access-8t87n\") pod \"ingress-canary-wm9mw\" (UID: \"ea808e89-1697-4235-8c42-8202cc97fef9\") " pod="openshift-ingress-canary/ingress-canary-wm9mw" Apr 22 19:58:47.712608 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.712426 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-bound-sa-token\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:47.713473 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:47.713443 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrrl6\" (UniqueName: \"kubernetes.io/projected/7c89560e-578e-4ca8-bb28-9608c190c546-kube-api-access-hrrl6\") pod \"dns-default-4sbb4\" (UID: \"7c89560e-578e-4ca8-bb28-9608c190c546\") " pod="openshift-dns/dns-default-4sbb4" Apr 22 19:58:48.202852 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:48.202809 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert\") pod \"ingress-canary-wm9mw\" (UID: \"ea808e89-1697-4235-8c42-8202cc97fef9\") " pod="openshift-ingress-canary/ingress-canary-wm9mw" Apr 22 19:58:48.203089 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:48.202863 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls\") pod \"dns-default-4sbb4\" (UID: \"7c89560e-578e-4ca8-bb28-9608c190c546\") " pod="openshift-dns/dns-default-4sbb4" Apr 22 19:58:48.203089 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:48.202902 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:48.203089 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:48.202965 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:48.203089 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:48.203012 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:48.203089 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:48.203019 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:48.203089 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:48.203026 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dffdb89c7-6k4v2: secret "image-registry-tls" not found Apr 22 19:58:48.203089 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:48.203053 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert podName:ea808e89-1697-4235-8c42-8202cc97fef9 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:49.203036655 +0000 UTC m=+35.157399434 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert") pod "ingress-canary-wm9mw" (UID: "ea808e89-1697-4235-8c42-8202cc97fef9") : secret "canary-serving-cert" not found Apr 22 19:58:48.203089 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:48.203083 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls podName:7c89560e-578e-4ca8-bb28-9608c190c546 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:49.203069338 +0000 UTC m=+35.157432121 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls") pod "dns-default-4sbb4" (UID: "7c89560e-578e-4ca8-bb28-9608c190c546") : secret "dns-default-metrics-tls" not found Apr 22 19:58:48.203472 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:48.203111 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls podName:5f7a6a88-1728-49a9-a02a-3f66d4a19934 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:49.203101766 +0000 UTC m=+35.157464553 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls") pod "image-registry-dffdb89c7-6k4v2" (UID: "5f7a6a88-1728-49a9-a02a-3f66d4a19934") : secret "image-registry-tls" not found Apr 22 19:58:48.304204 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:48.304171 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret\") pod \"global-pull-secret-syncer-vlm4l\" (UID: \"d0402063-8f80-4f0b-8247-b3bd2ae51e51\") " pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:48.304363 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:48.304215 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs\") pod \"network-metrics-daemon-p2gjc\" (UID: \"56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728\") " pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:58:48.304363 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:48.304323 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:58:48.304465 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:48.304384 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs podName:56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:20.304364707 +0000 UTC m=+66.258727502 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs") pod "network-metrics-daemon-p2gjc" (UID: "56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728") : secret "metrics-daemon-secret" not found Apr 22 19:58:48.306803 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:48.306778 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0402063-8f80-4f0b-8247-b3bd2ae51e51-original-pull-secret\") pod \"global-pull-secret-syncer-vlm4l\" (UID: \"d0402063-8f80-4f0b-8247-b3bd2ae51e51\") " pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:48.308680 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:48.308652 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vlm4l" Apr 22 19:58:48.405447 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:48.405412 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xx55h\" (UniqueName: \"kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h\") pod \"network-check-target-64jvm\" (UID: \"ebb887c1-8ca2-4d72-9902-320466d0dce7\") " pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:48.407873 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:48.407855 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx55h\" (UniqueName: \"kubernetes.io/projected/ebb887c1-8ca2-4d72-9902-320466d0dce7-kube-api-access-xx55h\") pod \"network-check-target-64jvm\" (UID: \"ebb887c1-8ca2-4d72-9902-320466d0dce7\") " pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:48.584234 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:48.584065 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vlm4l"] Apr 22 19:58:48.587549 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:48.587513 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0402063_8f80_4f0b_8247_b3bd2ae51e51.slice/crio-aed80c0c81889ea39c43259e7f98a9aa6b1239db4ba3e95ffd41825d1927cc8a WatchSource:0}: Error finding container aed80c0c81889ea39c43259e7f98a9aa6b1239db4ba3e95ffd41825d1927cc8a: Status 404 returned error can't find the container with id aed80c0c81889ea39c43259e7f98a9aa6b1239db4ba3e95ffd41825d1927cc8a Apr 22 19:58:48.634656 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:48.634635 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:48.865470 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:48.865439 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-64jvm"] Apr 22 19:58:48.869704 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:48.869671 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebb887c1_8ca2_4d72_9902_320466d0dce7.slice/crio-6a2ce642d255dd0dc9a7ffef58d08870b0bf0da240ea628d18d54c55a5ae4941 WatchSource:0}: Error finding container 6a2ce642d255dd0dc9a7ffef58d08870b0bf0da240ea628d18d54c55a5ae4941: Status 404 returned error can't find the container with id 6a2ce642d255dd0dc9a7ffef58d08870b0bf0da240ea628d18d54c55a5ae4941 Apr 22 19:58:48.894914 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:48.894885 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-64jvm" event={"ID":"ebb887c1-8ca2-4d72-9902-320466d0dce7","Type":"ContainerStarted","Data":"6a2ce642d255dd0dc9a7ffef58d08870b0bf0da240ea628d18d54c55a5ae4941"} Apr 22 19:58:48.895714 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:48.895693 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vlm4l" event={"ID":"d0402063-8f80-4f0b-8247-b3bd2ae51e51","Type":"ContainerStarted","Data":"aed80c0c81889ea39c43259e7f98a9aa6b1239db4ba3e95ffd41825d1927cc8a"} Apr 22 19:58:49.211866 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:49.211833 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert\") pod \"ingress-canary-wm9mw\" (UID: \"ea808e89-1697-4235-8c42-8202cc97fef9\") " pod="openshift-ingress-canary/ingress-canary-wm9mw" Apr 22 19:58:49.211866 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:49.211873 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls\") pod \"dns-default-4sbb4\" (UID: \"7c89560e-578e-4ca8-bb28-9608c190c546\") " pod="openshift-dns/dns-default-4sbb4" Apr 22 19:58:49.212081 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:49.211898 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:49.212081 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:49.212008 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:49.212081 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:49.212016 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:49.212081 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:49.212034 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dffdb89c7-6k4v2: secret "image-registry-tls" not found Apr 22 19:58:49.212081 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:49.212010 2579 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:49.212081 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:49.212074 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls podName:7c89560e-578e-4ca8-bb28-9608c190c546 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:51.212054032 +0000 UTC m=+37.166416811 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls") pod "dns-default-4sbb4" (UID: "7c89560e-578e-4ca8-bb28-9608c190c546") : secret "dns-default-metrics-tls" not found Apr 22 19:58:49.212360 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:49.212090 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls podName:5f7a6a88-1728-49a9-a02a-3f66d4a19934 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:51.212081403 +0000 UTC m=+37.166444184 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls") pod "image-registry-dffdb89c7-6k4v2" (UID: "5f7a6a88-1728-49a9-a02a-3f66d4a19934") : secret "image-registry-tls" not found Apr 22 19:58:49.212360 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:49.212102 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert podName:ea808e89-1697-4235-8c42-8202cc97fef9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:51.212095732 +0000 UTC m=+37.166458511 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert") pod "ingress-canary-wm9mw" (UID: "ea808e89-1697-4235-8c42-8202cc97fef9") : secret "canary-serving-cert" not found Apr 22 19:58:49.901652 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:49.901612 2579 generic.go:358] "Generic (PLEG): container finished" podID="f5b56a04-ef22-4b41-8aa2-34438e2003fe" containerID="f022dcceffc77707fcae29764f23786789b51c26baba43f60a11dfdb1da90078" exitCode=0 Apr 22 19:58:49.902058 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:49.901682 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-86nst" event={"ID":"f5b56a04-ef22-4b41-8aa2-34438e2003fe","Type":"ContainerDied","Data":"f022dcceffc77707fcae29764f23786789b51c26baba43f60a11dfdb1da90078"} Apr 22 19:58:50.907152 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:50.907098 2579 generic.go:358] "Generic (PLEG): container finished" podID="f5b56a04-ef22-4b41-8aa2-34438e2003fe" containerID="4c9750383c7c9f266131de486e840c2d38a57ee7051d8c283ce8b3b16c27fed1" exitCode=0 Apr 22 19:58:50.907579 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:50.907186 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-86nst" event={"ID":"f5b56a04-ef22-4b41-8aa2-34438e2003fe","Type":"ContainerDied","Data":"4c9750383c7c9f266131de486e840c2d38a57ee7051d8c283ce8b3b16c27fed1"} Apr 22 19:58:51.231819 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:51.231720 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert\") pod \"ingress-canary-wm9mw\" (UID: 
\"ea808e89-1697-4235-8c42-8202cc97fef9\") " pod="openshift-ingress-canary/ingress-canary-wm9mw" Apr 22 19:58:51.231819 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:51.231775 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls\") pod \"dns-default-4sbb4\" (UID: \"7c89560e-578e-4ca8-bb28-9608c190c546\") " pod="openshift-dns/dns-default-4sbb4" Apr 22 19:58:51.231819 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:51.231810 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:51.232075 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:51.231887 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:51.232075 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:51.231952 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:51.232075 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:51.231959 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:51.232075 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:51.231978 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dffdb89c7-6k4v2: secret "image-registry-tls" not found Apr 22 19:58:51.232075 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:51.231967 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert podName:ea808e89-1697-4235-8c42-8202cc97fef9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:55.231945448 +0000 UTC m=+41.186308230 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert") pod "ingress-canary-wm9mw" (UID: "ea808e89-1697-4235-8c42-8202cc97fef9") : secret "canary-serving-cert" not found Apr 22 19:58:51.232075 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:51.232025 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls podName:7c89560e-578e-4ca8-bb28-9608c190c546 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:55.232008474 +0000 UTC m=+41.186371255 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls") pod "dns-default-4sbb4" (UID: "7c89560e-578e-4ca8-bb28-9608c190c546") : secret "dns-default-metrics-tls" not found Apr 22 19:58:51.232075 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:51.232044 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls podName:5f7a6a88-1728-49a9-a02a-3f66d4a19934 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:55.23203565 +0000 UTC m=+41.186398432 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls") pod "image-registry-dffdb89c7-6k4v2" (UID: "5f7a6a88-1728-49a9-a02a-3f66d4a19934") : secret "image-registry-tls" not found Apr 22 19:58:53.913384 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:53.913347 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-64jvm" event={"ID":"ebb887c1-8ca2-4d72-9902-320466d0dce7","Type":"ContainerStarted","Data":"317d5fb3a691a66bd4184f2c5d54821467274bc118f8ca7279616403250a0848"} Apr 22 19:58:53.913767 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:53.913510 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-64jvm" Apr 22 19:58:53.914735 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:53.914711 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vlm4l" event={"ID":"d0402063-8f80-4f0b-8247-b3bd2ae51e51","Type":"ContainerStarted","Data":"0f2f020c6251199cbe38ff0ba7e14e71239a4a8667c481f5cbc460fc01cf3e55"} Apr 22 19:58:53.917739 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:53.917674 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-86nst" event={"ID":"f5b56a04-ef22-4b41-8aa2-34438e2003fe","Type":"ContainerStarted","Data":"798f7249bdb51911e33d08476604ae3575b4f6ca9d4ec0900f2a2472e2cd5299"} Apr 22 19:58:53.928217 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:53.928168 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-64jvm" podStartSLOduration=34.099644558 podStartE2EDuration="38.928130597s" podCreationTimestamp="2026-04-22 19:58:15 +0000 UTC" firstStartedPulling="2026-04-22 19:58:48.871993652 +0000 UTC m=+34.826356437" lastFinishedPulling="2026-04-22 19:58:53.700479681 +0000 UTC m=+39.654842476" observedRunningTime="2026-04-22 19:58:53.927743122 +0000 UTC m=+39.882105924" watchObservedRunningTime="2026-04-22 19:58:53.928130597 +0000 UTC m=+39.882493400" Apr 22 19:58:53.947181 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:53.947122 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-86nst" podStartSLOduration=8.415795365 podStartE2EDuration="39.947114836s" podCreationTimestamp="2026-04-22 19:58:14 +0000 UTC" firstStartedPulling="2026-04-22 19:58:17.236678005 +0000 UTC m=+3.191040799" lastFinishedPulling="2026-04-22 19:58:48.767997478 +0000 UTC m=+34.722360270" observedRunningTime="2026-04-22 19:58:53.946307338 +0000 UTC m=+39.900670143" watchObservedRunningTime="2026-04-22 19:58:53.947114836 +0000 UTC m=+39.901477636" Apr 22 19:58:53.960510 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:53.960465 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-vlm4l" podStartSLOduration=33.839198021 podStartE2EDuration="38.960451972s" podCreationTimestamp="2026-04-22 19:58:15 +0000 UTC" firstStartedPulling="2026-04-22 19:58:48.590467402 +0000 UTC m=+34.544830195" lastFinishedPulling="2026-04-22 19:58:53.711721352 +0000 UTC m=+39.666084146" observedRunningTime="2026-04-22 19:58:53.959435547 +0000 UTC m=+39.913798348" watchObservedRunningTime="2026-04-22 19:58:53.960451972 +0000 UTC m=+39.914814774" Apr 22 19:58:55.266515 ip-10-0-141-46 kubenswrapper[2579]: I0422 
19:58:55.266480 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert\") pod \"ingress-canary-wm9mw\" (UID: \"ea808e89-1697-4235-8c42-8202cc97fef9\") " pod="openshift-ingress-canary/ingress-canary-wm9mw" Apr 22 19:58:55.266515 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.266520 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls\") pod \"dns-default-4sbb4\" (UID: \"7c89560e-578e-4ca8-bb28-9608c190c546\") " pod="openshift-dns/dns-default-4sbb4" Apr 22 19:58:55.266947 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.266543 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:58:55.266947 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:55.266634 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:55.266947 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:55.266645 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:55.266947 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:55.266661 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:55.266947 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:55.266674 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dffdb89c7-6k4v2: secret "image-registry-tls" not found Apr 22 19:58:55.266947 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:55.266695 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert podName:ea808e89-1697-4235-8c42-8202cc97fef9 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:03.26667999 +0000 UTC m=+49.221042769 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert") pod "ingress-canary-wm9mw" (UID: "ea808e89-1697-4235-8c42-8202cc97fef9") : secret "canary-serving-cert" not found Apr 22 19:58:55.266947 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:55.266710 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls podName:7c89560e-578e-4ca8-bb28-9608c190c546 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:03.266703474 +0000 UTC m=+49.221066252 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls") pod "dns-default-4sbb4" (UID: "7c89560e-578e-4ca8-bb28-9608c190c546") : secret "dns-default-metrics-tls" not found Apr 22 19:58:55.266947 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:58:55.266748 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls podName:5f7a6a88-1728-49a9-a02a-3f66d4a19934 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:59:03.266735483 +0000 UTC m=+49.221098262 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls") pod "image-registry-dffdb89c7-6k4v2" (UID: "5f7a6a88-1728-49a9-a02a-3f66d4a19934") : secret "image-registry-tls" not found Apr 22 19:58:55.478215 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.478177 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86df98cf65-ghmv9"] Apr 22 19:58:55.503555 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.503521 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b645f86fb-tzhnl"] Apr 22 19:58:55.503686 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.503555 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86df98cf65-ghmv9" Apr 22 19:58:55.505968 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.505949 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-ztfr6\"" Apr 22 19:58:55.506077 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.505949 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 19:58:55.506077 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.505949 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 19:58:55.506268 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.506252 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 19:58:55.506614 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.506601 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 19:58:55.515838 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.515816 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86df98cf65-ghmv9"] Apr 22 19:58:55.515940 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.515842 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b645f86fb-tzhnl"] Apr 22 19:58:55.515940 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.515856 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8"] Apr 22 19:58:55.516042 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.515943 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b645f86fb-tzhnl" Apr 22 19:58:55.518352 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.518296 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 19:58:55.528212 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.528194 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8"] Apr 22 19:58:55.528303 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.528290 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" Apr 22 19:58:55.530543 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.530529 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 19:58:55.530713 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.530689 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 19:58:55.530798 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.530717 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 19:58:55.530798 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.530723 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 19:58:55.669017 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.668983 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b970045a-8b6f-4766-822f-88dc082e7124-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-86df98cf65-ghmv9\" (UID: \"b970045a-8b6f-4766-822f-88dc082e7124\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86df98cf65-ghmv9" Apr 22 19:58:55.669183 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.669051 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msbq4\" (UniqueName: \"kubernetes.io/projected/b970045a-8b6f-4766-822f-88dc082e7124-kube-api-access-msbq4\") pod \"managed-serviceaccount-addon-agent-86df98cf65-ghmv9\" (UID: \"b970045a-8b6f-4766-822f-88dc082e7124\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86df98cf65-ghmv9" Apr 22 19:58:55.669183 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.669103 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/70e580ec-d1ea-46db-8b30-9c2712ac7d32-ca\") pod \"cluster-proxy-proxy-agent-85d87b75ff-whtl8\" (UID: \"70e580ec-d1ea-46db-8b30-9c2712ac7d32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" Apr 22 19:58:55.669183 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.669126 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/70e580ec-d1ea-46db-8b30-9c2712ac7d32-hub\") pod 
\"cluster-proxy-proxy-agent-85d87b75ff-whtl8\" (UID: \"70e580ec-d1ea-46db-8b30-9c2712ac7d32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" Apr 22 19:58:55.669183 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.669175 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqf9v\" (UniqueName: \"kubernetes.io/projected/70e580ec-d1ea-46db-8b30-9c2712ac7d32-kube-api-access-xqf9v\") pod \"cluster-proxy-proxy-agent-85d87b75ff-whtl8\" (UID: \"70e580ec-d1ea-46db-8b30-9c2712ac7d32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" Apr 22 19:58:55.669399 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.669200 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/70e580ec-d1ea-46db-8b30-9c2712ac7d32-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-85d87b75ff-whtl8\" (UID: \"70e580ec-d1ea-46db-8b30-9c2712ac7d32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" Apr 22 19:58:55.669399 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.669218 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ba511b60-4475-429c-aa35-b302aa94ada8-tmp\") pod \"klusterlet-addon-workmgr-5b645f86fb-tzhnl\" (UID: \"ba511b60-4475-429c-aa35-b302aa94ada8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b645f86fb-tzhnl" Apr 22 19:58:55.669399 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.669243 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ba511b60-4475-429c-aa35-b302aa94ada8-klusterlet-config\") pod \"klusterlet-addon-workmgr-5b645f86fb-tzhnl\" (UID: \"ba511b60-4475-429c-aa35-b302aa94ada8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b645f86fb-tzhnl" Apr 22 19:58:55.669399 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.669273 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9t7k\" (UniqueName: \"kubernetes.io/projected/ba511b60-4475-429c-aa35-b302aa94ada8-kube-api-access-r9t7k\") pod \"klusterlet-addon-workmgr-5b645f86fb-tzhnl\" (UID: \"ba511b60-4475-429c-aa35-b302aa94ada8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b645f86fb-tzhnl" Apr 22 19:58:55.669399 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.669304 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/70e580ec-d1ea-46db-8b30-9c2712ac7d32-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-85d87b75ff-whtl8\" (UID: \"70e580ec-d1ea-46db-8b30-9c2712ac7d32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" Apr 22 19:58:55.669399 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.669335 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/70e580ec-d1ea-46db-8b30-9c2712ac7d32-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-85d87b75ff-whtl8\" (UID: \"70e580ec-d1ea-46db-8b30-9c2712ac7d32\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" Apr 22 19:58:55.769812 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.769748 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/70e580ec-d1ea-46db-8b30-9c2712ac7d32-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-85d87b75ff-whtl8\" (UID: \"70e580ec-d1ea-46db-8b30-9c2712ac7d32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" Apr 22 19:58:55.769812 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.769776 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ba511b60-4475-429c-aa35-b302aa94ada8-tmp\") pod \"klusterlet-addon-workmgr-5b645f86fb-tzhnl\" (UID: \"ba511b60-4475-429c-aa35-b302aa94ada8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b645f86fb-tzhnl" Apr 22 19:58:55.769812 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.769794 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ba511b60-4475-429c-aa35-b302aa94ada8-klusterlet-config\") pod \"klusterlet-addon-workmgr-5b645f86fb-tzhnl\" (UID: \"ba511b60-4475-429c-aa35-b302aa94ada8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b645f86fb-tzhnl" Apr 22 19:58:55.770038 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.769815 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9t7k\" (UniqueName: \"kubernetes.io/projected/ba511b60-4475-429c-aa35-b302aa94ada8-kube-api-access-r9t7k\") pod \"klusterlet-addon-workmgr-5b645f86fb-tzhnl\" (UID: \"ba511b60-4475-429c-aa35-b302aa94ada8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b645f86fb-tzhnl" Apr 22 19:58:55.770038 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.769859 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/70e580ec-d1ea-46db-8b30-9c2712ac7d32-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-85d87b75ff-whtl8\" (UID: \"70e580ec-d1ea-46db-8b30-9c2712ac7d32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" Apr 22 19:58:55.770038 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.769892 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/70e580ec-d1ea-46db-8b30-9c2712ac7d32-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-85d87b75ff-whtl8\" (UID: \"70e580ec-d1ea-46db-8b30-9c2712ac7d32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" Apr 22 19:58:55.770038 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.769915 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b970045a-8b6f-4766-822f-88dc082e7124-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-86df98cf65-ghmv9\" (UID: \"b970045a-8b6f-4766-822f-88dc082e7124\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86df98cf65-ghmv9" Apr 22 19:58:55.770038 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.769967 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-msbq4\" (UniqueName: 
\"kubernetes.io/projected/b970045a-8b6f-4766-822f-88dc082e7124-kube-api-access-msbq4\") pod \"managed-serviceaccount-addon-agent-86df98cf65-ghmv9\" (UID: \"b970045a-8b6f-4766-822f-88dc082e7124\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86df98cf65-ghmv9" Apr 22 19:58:55.770038 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.770002 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/70e580ec-d1ea-46db-8b30-9c2712ac7d32-ca\") pod \"cluster-proxy-proxy-agent-85d87b75ff-whtl8\" (UID: \"70e580ec-d1ea-46db-8b30-9c2712ac7d32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" Apr 22 19:58:55.770038 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.770030 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/70e580ec-d1ea-46db-8b30-9c2712ac7d32-hub\") pod \"cluster-proxy-proxy-agent-85d87b75ff-whtl8\" (UID: \"70e580ec-d1ea-46db-8b30-9c2712ac7d32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" Apr 22 19:58:55.770377 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.770054 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqf9v\" (UniqueName: \"kubernetes.io/projected/70e580ec-d1ea-46db-8b30-9c2712ac7d32-kube-api-access-xqf9v\") pod \"cluster-proxy-proxy-agent-85d87b75ff-whtl8\" (UID: \"70e580ec-d1ea-46db-8b30-9c2712ac7d32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" Apr 22 19:58:55.770930 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.770693 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/70e580ec-d1ea-46db-8b30-9c2712ac7d32-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-85d87b75ff-whtl8\" (UID: \"70e580ec-d1ea-46db-8b30-9c2712ac7d32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" Apr 22 19:58:55.770930 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.770711 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ba511b60-4475-429c-aa35-b302aa94ada8-tmp\") pod \"klusterlet-addon-workmgr-5b645f86fb-tzhnl\" (UID: \"ba511b60-4475-429c-aa35-b302aa94ada8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b645f86fb-tzhnl" Apr 22 19:58:55.774414 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.774392 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/70e580ec-d1ea-46db-8b30-9c2712ac7d32-ca\") pod \"cluster-proxy-proxy-agent-85d87b75ff-whtl8\" (UID: \"70e580ec-d1ea-46db-8b30-9c2712ac7d32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" Apr 22 19:58:55.774508 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.774466 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/70e580ec-d1ea-46db-8b30-9c2712ac7d32-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-85d87b75ff-whtl8\" (UID: \"70e580ec-d1ea-46db-8b30-9c2712ac7d32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" Apr 22 19:58:55.774508 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.774473 2579 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/70e580ec-d1ea-46db-8b30-9c2712ac7d32-hub\") pod \"cluster-proxy-proxy-agent-85d87b75ff-whtl8\" (UID: \"70e580ec-d1ea-46db-8b30-9c2712ac7d32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" Apr 22 19:58:55.774576 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.774518 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b970045a-8b6f-4766-822f-88dc082e7124-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-86df98cf65-ghmv9\" (UID: \"b970045a-8b6f-4766-822f-88dc082e7124\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86df98cf65-ghmv9" Apr 22 19:58:55.774576 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.774518 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/70e580ec-d1ea-46db-8b30-9c2712ac7d32-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-85d87b75ff-whtl8\" (UID: \"70e580ec-d1ea-46db-8b30-9c2712ac7d32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" Apr 22 19:58:55.774576 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.774558 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ba511b60-4475-429c-aa35-b302aa94ada8-klusterlet-config\") pod \"klusterlet-addon-workmgr-5b645f86fb-tzhnl\" (UID: \"ba511b60-4475-429c-aa35-b302aa94ada8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b645f86fb-tzhnl" Apr 22 19:58:55.777451 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.777428 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9t7k\" (UniqueName: \"kubernetes.io/projected/ba511b60-4475-429c-aa35-b302aa94ada8-kube-api-access-r9t7k\") pod \"klusterlet-addon-workmgr-5b645f86fb-tzhnl\" (UID: \"ba511b60-4475-429c-aa35-b302aa94ada8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b645f86fb-tzhnl" Apr 22 19:58:55.777775 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.777754 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqf9v\" (UniqueName: \"kubernetes.io/projected/70e580ec-d1ea-46db-8b30-9c2712ac7d32-kube-api-access-xqf9v\") pod \"cluster-proxy-proxy-agent-85d87b75ff-whtl8\" (UID: \"70e580ec-d1ea-46db-8b30-9c2712ac7d32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" Apr 22 19:58:55.778154 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.778118 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-msbq4\" (UniqueName: \"kubernetes.io/projected/b970045a-8b6f-4766-822f-88dc082e7124-kube-api-access-msbq4\") pod \"managed-serviceaccount-addon-agent-86df98cf65-ghmv9\" (UID: \"b970045a-8b6f-4766-822f-88dc082e7124\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86df98cf65-ghmv9" Apr 22 19:58:55.825008 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.824988 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86df98cf65-ghmv9" Apr 22 19:58:55.834755 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.834733 2579 util.go:30] "No sandbox for pod can be found. 
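[Editor's note] The block above is the kubelet's volume reconciler at work: reconciler_common.go:224 records the moment an asynchronous mount is kicked off, and operation_generator.go:615 reports its completion, so every healthy volume appears as a started/succeeded pair (secrets, configmaps, empty-dirs and projected service-account tokens alike). A minimal sketch of that started/succeeded pattern, assuming nothing about kubelet internals beyond what the two log sites show:

package main

import (
	"fmt"
	"sync"
)

type volume struct{ name, pod string }

// reconcile compares the volumes the pod specs require (desired) with the
// volumes already set up (actual) and launches one async mount per gap,
// emitting the same started/succeeded pair seen in the journal above.
func reconcile(desired, actual []volume, setUp func(volume) error) {
	mounted := map[volume]bool{}
	for _, v := range actual {
		mounted[v] = true
	}
	var wg sync.WaitGroup
	for _, v := range desired {
		if mounted[v] {
			continue
		}
		fmt.Printf("operationExecutor.MountVolume started for volume %q pod %q\n", v.name, v.pod)
		wg.Add(1)
		go func(v volume) { // mounts run concurrently, like the operation executor
			defer wg.Done()
			if err := setUp(v); err != nil {
				fmt.Printf("MountVolume.SetUp failed for volume %q: %v\n", v.name, err)
				return
			}
			fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", v.name)
		}(v)
	}
	wg.Wait()
}

func main() {
	vols := []volume{
		{"tmp", "klusterlet-addon-workmgr-5b645f86fb-tzhnl"},
		{"hub", "cluster-proxy-proxy-agent-85d87b75ff-whtl8"},
	}
	reconcile(vols, nil, func(volume) error { return nil })
}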
Apr 22 19:58:55.825008 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.824988 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86df98cf65-ghmv9"
Apr 22 19:58:55.834755 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.834733 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b645f86fb-tzhnl"
Apr 22 19:58:55.839437 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.839415 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8"
Apr 22 19:58:55.952993 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:55.952969 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86df98cf65-ghmv9"]
Apr 22 19:58:55.959936 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:55.959909 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb970045a_8b6f_4766_822f_88dc082e7124.slice/crio-778ebc511cdd74218a800cad4767c1d79a0a114e396aae0029be5c69df8458b5 WatchSource:0}: Error finding container 778ebc511cdd74218a800cad4767c1d79a0a114e396aae0029be5c69df8458b5: Status 404 returned error can't find the container with id 778ebc511cdd74218a800cad4767c1d79a0a114e396aae0029be5c69df8458b5
Apr 22 19:58:56.177571 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:56.177542 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b645f86fb-tzhnl"]
Apr 22 19:58:56.180798 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:56.180774 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba511b60_4475_429c_aa35_b302aa94ada8.slice/crio-2527e3c8039f340afc8100f43d59e898dbfc8cb6de0b774ff3a437f8e8585cf5 WatchSource:0}: Error finding container 2527e3c8039f340afc8100f43d59e898dbfc8cb6de0b774ff3a437f8e8585cf5: Status 404 returned error can't find the container with id 2527e3c8039f340afc8100f43d59e898dbfc8cb6de0b774ff3a437f8e8585cf5
Apr 22 19:58:56.180938 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:56.180922 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8"]
Apr 22 19:58:56.184048 ip-10-0-141-46 kubenswrapper[2579]: W0422 19:58:56.184028 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70e580ec_d1ea_46db_8b30_9c2712ac7d32.slice/crio-6ac14041f5a104e80fc4a47c2e8d40e2eaad86bd485b1c73223c060815cfae76 WatchSource:0}: Error finding container 6ac14041f5a104e80fc4a47c2e8d40e2eaad86bd485b1c73223c060815cfae76: Status 404 returned error can't find the container with id 6ac14041f5a104e80fc4a47c2e8d40e2eaad86bd485b1c73223c060815cfae76
Apr 22 19:58:56.926625 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:56.926580 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86df98cf65-ghmv9" event={"ID":"b970045a-8b6f-4766-822f-88dc082e7124","Type":"ContainerStarted","Data":"778ebc511cdd74218a800cad4767c1d79a0a114e396aae0029be5c69df8458b5"}
Apr 22 19:58:56.927852 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:56.927821 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" event={"ID":"70e580ec-d1ea-46db-8b30-9c2712ac7d32","Type":"ContainerStarted","Data":"6ac14041f5a104e80fc4a47c2e8d40e2eaad86bd485b1c73223c060815cfae76"}
Apr 22 19:58:56.929361 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:58:56.929324 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b645f86fb-tzhnl" event={"ID":"ba511b60-4475-429c-aa35-b302aa94ada8","Type":"ContainerStarted","Data":"2527e3c8039f340afc8100f43d59e898dbfc8cb6de0b774ff3a437f8e8585cf5"}
Apr 22 19:59:02.944287 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:02.944255 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" event={"ID":"70e580ec-d1ea-46db-8b30-9c2712ac7d32","Type":"ContainerStarted","Data":"d8cdaf44c2278731af7fe1073261b059559b3516638945ed202e29a72e5f1cbf"}
Apr 22 19:59:02.945685 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:02.945649 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b645f86fb-tzhnl" event={"ID":"ba511b60-4475-429c-aa35-b302aa94ada8","Type":"ContainerStarted","Data":"64c7c3517581b4870dd03c5ace98f30f7cfda75ed32044317e50a37fa822d607"}
Apr 22 19:59:02.945883 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:02.945862 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b645f86fb-tzhnl"
Apr 22 19:59:02.947104 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:02.947079 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86df98cf65-ghmv9" event={"ID":"b970045a-8b6f-4766-822f-88dc082e7124","Type":"ContainerStarted","Data":"79b81f3b679459c54644e3e41bd2c68695d9f04c77522af4aad937bc3bfbf2bf"}
Apr 22 19:59:02.947618 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:02.947600 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b645f86fb-tzhnl"
Apr 22 19:59:02.965221 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:02.965178 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b645f86fb-tzhnl" podStartSLOduration=1.737025315 podStartE2EDuration="7.965166491s" podCreationTimestamp="2026-04-22 19:58:55 +0000 UTC" firstStartedPulling="2026-04-22 19:58:56.18288648 +0000 UTC m=+42.137249259" lastFinishedPulling="2026-04-22 19:59:02.411027653 +0000 UTC m=+48.365390435" observedRunningTime="2026-04-22 19:59:02.964288426 +0000 UTC m=+48.918651226" watchObservedRunningTime="2026-04-22 19:59:02.965166491 +0000 UTC m=+48.919529270"
Apr 22 19:59:02.978047 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:02.978008 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86df98cf65-ghmv9" podStartSLOduration=1.543056348 podStartE2EDuration="7.977997725s" podCreationTimestamp="2026-04-22 19:58:55 +0000 UTC" firstStartedPulling="2026-04-22 19:58:55.961842842 +0000 UTC m=+41.916205620" lastFinishedPulling="2026-04-22 19:59:02.396784215 +0000 UTC m=+48.351146997" observedRunningTime="2026-04-22 19:59:02.977681768 +0000 UTC m=+48.932044594" watchObservedRunningTime="2026-04-22 19:59:02.977997725 +0000 UTC m=+48.932360526"
\"ea808e89-1697-4235-8c42-8202cc97fef9\") " pod="openshift-ingress-canary/ingress-canary-wm9mw" Apr 22 19:59:03.335285 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:03.335237 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls\") pod \"dns-default-4sbb4\" (UID: \"7c89560e-578e-4ca8-bb28-9608c190c546\") " pod="openshift-dns/dns-default-4sbb4" Apr 22 19:59:03.335285 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:03.335256 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:59:03.335554 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:03.335362 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:59:03.335554 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:03.335364 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:59:03.335554 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:03.335418 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:59:03.335554 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:03.335440 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert podName:ea808e89-1697-4235-8c42-8202cc97fef9 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:19.335421158 +0000 UTC m=+65.289783955 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert") pod "ingress-canary-wm9mw" (UID: "ea808e89-1697-4235-8c42-8202cc97fef9") : secret "canary-serving-cert" not found Apr 22 19:59:03.335554 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:03.335460 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls podName:7c89560e-578e-4ca8-bb28-9608c190c546 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:19.335447874 +0000 UTC m=+65.289810653 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls") pod "dns-default-4sbb4" (UID: "7c89560e-578e-4ca8-bb28-9608c190c546") : secret "dns-default-metrics-tls" not found Apr 22 19:59:03.335554 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:03.335377 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dffdb89c7-6k4v2: secret "image-registry-tls" not found Apr 22 19:59:03.335554 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:03.335486 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls podName:5f7a6a88-1728-49a9-a02a-3f66d4a19934 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:19.335478766 +0000 UTC m=+65.289841544 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls") pod "image-registry-dffdb89c7-6k4v2" (UID: "5f7a6a88-1728-49a9-a02a-3f66d4a19934") : secret "image-registry-tls" not found Apr 22 19:59:05.954684 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:05.954649 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" event={"ID":"70e580ec-d1ea-46db-8b30-9c2712ac7d32","Type":"ContainerStarted","Data":"e8ae179fe8ec14bcc380c5f8aac76c31aa2ace31e86161ddd9344bc4eccbad8c"} Apr 22 19:59:05.954684 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:05.954683 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" event={"ID":"70e580ec-d1ea-46db-8b30-9c2712ac7d32","Type":"ContainerStarted","Data":"b9a673ad608021ceadb087751369e63b43a85a4e0112f22eef859fabc93fc0e9"} Apr 22 19:59:05.974628 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:05.974588 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" podStartSLOduration=1.847005288 podStartE2EDuration="10.974574312s" podCreationTimestamp="2026-04-22 19:58:55 +0000 UTC" firstStartedPulling="2026-04-22 19:58:56.185732092 +0000 UTC m=+42.140094874" lastFinishedPulling="2026-04-22 19:59:05.313301114 +0000 UTC m=+51.267663898" observedRunningTime="2026-04-22 19:59:05.973231367 +0000 UTC m=+51.927594167" watchObservedRunningTime="2026-04-22 19:59:05.974574312 +0000 UTC m=+51.928937113" Apr 22 19:59:11.892604 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:11.892573 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cw48v" Apr 22 19:59:19.345562 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:19.345522 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert\") pod \"ingress-canary-wm9mw\" (UID: \"ea808e89-1697-4235-8c42-8202cc97fef9\") " pod="openshift-ingress-canary/ingress-canary-wm9mw" Apr 22 19:59:19.345562 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:19.345563 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls\") pod \"dns-default-4sbb4\" (UID: \"7c89560e-578e-4ca8-bb28-9608c190c546\") " pod="openshift-dns/dns-default-4sbb4" Apr 22 19:59:19.346022 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:19.345595 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 19:59:19.346022 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:19.345674 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:59:19.346022 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:19.345724 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:59:19.346022 ip-10-0-141-46 kubenswrapper[2579]: 
E0422 19:59:19.345731 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:59:19.346022 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:19.345753 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dffdb89c7-6k4v2: secret "image-registry-tls" not found Apr 22 19:59:19.346022 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:19.345753 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert podName:ea808e89-1697-4235-8c42-8202cc97fef9 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:51.345732914 +0000 UTC m=+97.300095708 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert") pod "ingress-canary-wm9mw" (UID: "ea808e89-1697-4235-8c42-8202cc97fef9") : secret "canary-serving-cert" not found Apr 22 19:59:19.346022 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:19.345800 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls podName:7c89560e-578e-4ca8-bb28-9608c190c546 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:51.345790402 +0000 UTC m=+97.300153185 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls") pod "dns-default-4sbb4" (UID: "7c89560e-578e-4ca8-bb28-9608c190c546") : secret "dns-default-metrics-tls" not found Apr 22 19:59:19.346022 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:19.345814 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls podName:5f7a6a88-1728-49a9-a02a-3f66d4a19934 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:51.345807826 +0000 UTC m=+97.300170605 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls") pod "image-registry-dffdb89c7-6k4v2" (UID: "5f7a6a88-1728-49a9-a02a-3f66d4a19934") : secret "image-registry-tls" not found Apr 22 19:59:20.353662 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:20.353628 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs\") pod \"network-metrics-daemon-p2gjc\" (UID: \"56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728\") " pod="openshift-multus/network-metrics-daemon-p2gjc" Apr 22 19:59:20.354031 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:20.353786 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:59:20.354031 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:20.353853 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs podName:56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:24.353840305 +0000 UTC m=+130.308203084 (durationBeforeRetry 1m4s). 
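[Editor's note] Note the durationBeforeRetry progression for these failing mounts: the same three volumes were told to wait 16s at 19:59:03 and 32s here, and the node-exporter-tls mount further down steps through 500ms and then 1s. Each failure doubles the delay up to a ceiling. A sketch of that policy, assuming an initial 500ms delay, a doubling factor, and a roughly two-minute cap; the exact constants are an assumption, not read out of kubelet source:

package main

import (
	"fmt"
	"time"
)

// nextDelay doubles the previous retry delay, starting at 500ms and
// saturating at an assumed two-minute ceiling.
func nextDelay(cur time.Duration) time.Duration {
	const (
		initial  = 500 * time.Millisecond
		maxDelay = 2 * time.Minute // assumed ceiling
	)
	switch {
	case cur == 0:
		return initial
	case cur*2 < maxDelay:
		return cur * 2
	default:
		return maxDelay
	}
}

func main() {
	var d time.Duration
	for i := 0; i < 9; i++ {
		d = nextDelay(d)
		fmt.Println(d) // 500ms 1s 2s 4s 8s 16s 32s 1m4s 2m0s
	}
}

The schedule is honored precisely: the metrics-certs volume just below is penalized 1m4s with a retry "not permitted until" 20:00:24.35, and the remount later in the log runs at 20:00:24.41 and succeeds.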
Apr 22 19:59:20.353662 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:20.353628 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs\") pod \"network-metrics-daemon-p2gjc\" (UID: \"56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728\") " pod="openshift-multus/network-metrics-daemon-p2gjc"
Apr 22 19:59:20.354031 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:20.353786 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 19:59:20.354031 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:20.353853 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs podName:56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:24.353840305 +0000 UTC m=+130.308203084 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs") pod "network-metrics-daemon-p2gjc" (UID: "56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728") : secret "metrics-daemon-secret" not found
Apr 22 19:59:24.921603 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:24.921565 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-64jvm"
Apr 22 19:59:48.648090 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:48.647978 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-vk6km_603cfe2d-1c53-4bc8-acc0-b1d1751c2817/dns-node-resolver/0.log"
Apr 22 19:59:49.847275 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:49.847245 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xzq74_799125ad-367b-42aa-a2e1-222e89529bac/node-ca/0.log"
Apr 22 19:59:51.357575 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:51.357544 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert\") pod \"ingress-canary-wm9mw\" (UID: \"ea808e89-1697-4235-8c42-8202cc97fef9\") " pod="openshift-ingress-canary/ingress-canary-wm9mw"
Apr 22 19:59:51.357909 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:51.357581 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls\") pod \"dns-default-4sbb4\" (UID: \"7c89560e-578e-4ca8-bb28-9608c190c546\") " pod="openshift-dns/dns-default-4sbb4"
Apr 22 19:59:51.357909 ip-10-0-141-46 kubenswrapper[2579]: I0422 19:59:51.357607 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls\") pod \"image-registry-dffdb89c7-6k4v2\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2"
Apr 22 19:59:51.357909 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:51.357681 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:59:51.357909 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:51.357708 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:59:51.357909 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:51.357733 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:59:51.357909 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:51.357748 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-dffdb89c7-6k4v2: secret "image-registry-tls" not found
Apr 22 19:59:51.357909 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:51.357761 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert podName:ea808e89-1697-4235-8c42-8202cc97fef9 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:55.357745061 +0000 UTC m=+161.312107840 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert") pod "ingress-canary-wm9mw" (UID: "ea808e89-1697-4235-8c42-8202cc97fef9") : secret "canary-serving-cert" not found
Apr 22 19:59:51.357909 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:51.357775 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls podName:7c89560e-578e-4ca8-bb28-9608c190c546 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:55.357769477 +0000 UTC m=+161.312132256 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls") pod "dns-default-4sbb4" (UID: "7c89560e-578e-4ca8-bb28-9608c190c546") : secret "dns-default-metrics-tls" not found
Apr 22 19:59:51.357909 ip-10-0-141-46 kubenswrapper[2579]: E0422 19:59:51.357787 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls podName:5f7a6a88-1728-49a9-a02a-3f66d4a19934 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:55.357780363 +0000 UTC m=+161.312143141 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls") pod "image-registry-dffdb89c7-6k4v2" (UID: "5f7a6a88-1728-49a9-a02a-3f66d4a19934") : secret "image-registry-tls" not found
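[Editor's note] All three mounts have now failed for almost a minute with the same root cause: the referenced secrets simply do not exist yet while the cluster's operators are still coming up (metrics-daemon-secret is in the same state). A hedged client-go sketch for confirming that from outside the node; the namespace/name pairs are taken from the errors above, while the kubeconfig path is an assumption:

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed kubeconfig location; adjust to your environment.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config")
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	secrets := map[string]string{
		"openshift-ingress-canary": "canary-serving-cert",
		"openshift-dns":            "dns-default-metrics-tls",
		"openshift-image-registry": "image-registry-tls",
		"openshift-multus":         "metrics-daemon-secret",
	}
	for ns, name := range secrets {
		_, err := cs.CoreV1().Secrets(ns).Get(context.TODO(), name, metav1.GetOptions{})
		fmt.Printf("%s/%s: %v\n", ns, name, err) // a nil error means the secret now exists
	}
}

Once a secret appears, the next scheduled retry picks it up and the pod unblocks, as the metrics-certs remount later in this log demonstrates.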
Apr 22 20:00:08.564932 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.564896 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-b8fc9"]
Apr 22 20:00:08.566948 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.566930 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-b8fc9"
Apr 22 20:00:08.570465 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.570438 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 20:00:08.570465 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.570450 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rfw8m\""
Apr 22 20:00:08.570608 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.570438 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 20:00:08.570985 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.570971 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 20:00:08.576401 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.576381 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 20:00:08.587745 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.587724 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-b8fc9"]
Apr 22 20:00:08.687053 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.687018 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ea34c500-4ae8-491f-a03b-da5186e37c7d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b8fc9\" (UID: \"ea34c500-4ae8-491f-a03b-da5186e37c7d\") " pod="openshift-insights/insights-runtime-extractor-b8fc9"
Apr 22 20:00:08.687053 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.687060 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s9dl\" (UniqueName: \"kubernetes.io/projected/ea34c500-4ae8-491f-a03b-da5186e37c7d-kube-api-access-5s9dl\") pod \"insights-runtime-extractor-b8fc9\" (UID: \"ea34c500-4ae8-491f-a03b-da5186e37c7d\") " pod="openshift-insights/insights-runtime-extractor-b8fc9"
Apr 22 20:00:08.687279 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.687087 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ea34c500-4ae8-491f-a03b-da5186e37c7d-crio-socket\") pod \"insights-runtime-extractor-b8fc9\" (UID: \"ea34c500-4ae8-491f-a03b-da5186e37c7d\") " pod="openshift-insights/insights-runtime-extractor-b8fc9"
Apr 22 20:00:08.687279 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.687106 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ea34c500-4ae8-491f-a03b-da5186e37c7d-data-volume\") pod \"insights-runtime-extractor-b8fc9\" (UID: \"ea34c500-4ae8-491f-a03b-da5186e37c7d\") " pod="openshift-insights/insights-runtime-extractor-b8fc9"
Apr 22 20:00:08.687279 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.687204 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ea34c500-4ae8-491f-a03b-da5186e37c7d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b8fc9\" (UID: \"ea34c500-4ae8-491f-a03b-da5186e37c7d\") " pod="openshift-insights/insights-runtime-extractor-b8fc9"
Apr 22 20:00:08.788380 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.788347 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s9dl\" (UniqueName: \"kubernetes.io/projected/ea34c500-4ae8-491f-a03b-da5186e37c7d-kube-api-access-5s9dl\") pod \"insights-runtime-extractor-b8fc9\" (UID: \"ea34c500-4ae8-491f-a03b-da5186e37c7d\") " pod="openshift-insights/insights-runtime-extractor-b8fc9"
Apr 22 20:00:08.788546 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.788388 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ea34c500-4ae8-491f-a03b-da5186e37c7d-crio-socket\") pod \"insights-runtime-extractor-b8fc9\" (UID: \"ea34c500-4ae8-491f-a03b-da5186e37c7d\") " pod="openshift-insights/insights-runtime-extractor-b8fc9"
Apr 22 20:00:08.788546 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.788407 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ea34c500-4ae8-491f-a03b-da5186e37c7d-data-volume\") pod \"insights-runtime-extractor-b8fc9\" (UID: \"ea34c500-4ae8-491f-a03b-da5186e37c7d\") " pod="openshift-insights/insights-runtime-extractor-b8fc9"
Apr 22 20:00:08.788546 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.788427 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ea34c500-4ae8-491f-a03b-da5186e37c7d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b8fc9\" (UID: \"ea34c500-4ae8-491f-a03b-da5186e37c7d\") " pod="openshift-insights/insights-runtime-extractor-b8fc9"
Apr 22 20:00:08.788546 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.788487 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ea34c500-4ae8-491f-a03b-da5186e37c7d-crio-socket\") pod \"insights-runtime-extractor-b8fc9\" (UID: \"ea34c500-4ae8-491f-a03b-da5186e37c7d\") " pod="openshift-insights/insights-runtime-extractor-b8fc9"
Apr 22 20:00:08.788546 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.788503 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ea34c500-4ae8-491f-a03b-da5186e37c7d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b8fc9\" (UID: \"ea34c500-4ae8-491f-a03b-da5186e37c7d\") " pod="openshift-insights/insights-runtime-extractor-b8fc9"
Apr 22 20:00:08.788750 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.788727 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ea34c500-4ae8-491f-a03b-da5186e37c7d-data-volume\") pod \"insights-runtime-extractor-b8fc9\" (UID: \"ea34c500-4ae8-491f-a03b-da5186e37c7d\") " pod="openshift-insights/insights-runtime-extractor-b8fc9"
Apr 22 20:00:08.789027 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.789009 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ea34c500-4ae8-491f-a03b-da5186e37c7d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b8fc9\" (UID: \"ea34c500-4ae8-491f-a03b-da5186e37c7d\") " pod="openshift-insights/insights-runtime-extractor-b8fc9"
Apr 22 20:00:08.790909 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.790888 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ea34c500-4ae8-491f-a03b-da5186e37c7d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b8fc9\" (UID: \"ea34c500-4ae8-491f-a03b-da5186e37c7d\") " pod="openshift-insights/insights-runtime-extractor-b8fc9"
Apr 22 20:00:08.796675 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.796656 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s9dl\" (UniqueName: \"kubernetes.io/projected/ea34c500-4ae8-491f-a03b-da5186e37c7d-kube-api-access-5s9dl\") pod \"insights-runtime-extractor-b8fc9\" (UID: \"ea34c500-4ae8-491f-a03b-da5186e37c7d\") " pod="openshift-insights/insights-runtime-extractor-b8fc9"
Apr 22 20:00:08.875735 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.875704 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-b8fc9"
Apr 22 20:00:08.991458 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:08.991429 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-b8fc9"]
Apr 22 20:00:08.994562 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:00:08.994532 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea34c500_4ae8_491f_a03b_da5186e37c7d.slice/crio-1150b5fca9695e267a5ba9e86c1ba34778953a804cad4d1a8e8d18a04ac1337a WatchSource:0}: Error finding container 1150b5fca9695e267a5ba9e86c1ba34778953a804cad4d1a8e8d18a04ac1337a: Status 404 returned error can't find the container with id 1150b5fca9695e267a5ba9e86c1ba34778953a804cad4d1a8e8d18a04ac1337a
Apr 22 20:00:09.102254 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:09.102221 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b8fc9" event={"ID":"ea34c500-4ae8-491f-a03b-da5186e37c7d","Type":"ContainerStarted","Data":"5e1f4e6910a9e8b0cac89b0d561242cb5ff06d0f04a09123abb29c6a7d233059"}
Apr 22 20:00:09.102254 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:09.102261 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b8fc9" event={"ID":"ea34c500-4ae8-491f-a03b-da5186e37c7d","Type":"ContainerStarted","Data":"1150b5fca9695e267a5ba9e86c1ba34778953a804cad4d1a8e8d18a04ac1337a"}
Apr 22 20:00:10.106316 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:10.106235 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b8fc9" event={"ID":"ea34c500-4ae8-491f-a03b-da5186e37c7d","Type":"ContainerStarted","Data":"8efa6a35d32a7f8f9c6a007885abf5d174f5e3cb35a2a217db69f5b5da00465d"}
Apr 22 20:00:12.113520 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:12.113474 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b8fc9" event={"ID":"ea34c500-4ae8-491f-a03b-da5186e37c7d","Type":"ContainerStarted","Data":"d3092bdfc7dbdecee08b9440ee4e07c4552db26573d9381f681ddd729a61633c"}
Apr 22 20:00:12.135076 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:12.135027 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-b8fc9" podStartSLOduration=1.834461282 podStartE2EDuration="4.134994554s" podCreationTimestamp="2026-04-22 20:00:08 +0000 UTC" firstStartedPulling="2026-04-22 20:00:09.056541979 +0000 UTC m=+115.010904759" lastFinishedPulling="2026-04-22 20:00:11.357075248 +0000 UTC m=+117.311438031" observedRunningTime="2026-04-22 20:00:12.13444192 +0000 UTC m=+118.088804745" watchObservedRunningTime="2026-04-22 20:00:12.134994554 +0000 UTC m=+118.089357388"
Apr 22 20:00:21.109526 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.109487 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-sk4xd"]
Apr 22 20:00:21.111603 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.111585 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.114015 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.113993 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 20:00:21.114502 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.114483 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 20:00:21.114611 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.114557 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 20:00:21.114611 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.114573 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 20:00:21.114611 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.114580 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 20:00:21.115195 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.115182 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-kl442\""
Apr 22 20:00:21.115248 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.115205 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 20:00:21.188787 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.188764 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5675d165-27d8-4220-9ec2-40aecf150447-root\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.188787 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.188789 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw5bn\" (UniqueName: \"kubernetes.io/projected/5675d165-27d8-4220-9ec2-40aecf150447-kube-api-access-kw5bn\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.188932 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.188826 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-accelerators-collector-config\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.188932 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.188866 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.188932 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.188904 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5675d165-27d8-4220-9ec2-40aecf150447-metrics-client-ca\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.188932 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.188922 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-textfile\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.189058 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.188964 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5675d165-27d8-4220-9ec2-40aecf150447-sys\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.189058 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.188990 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-tls\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.189058 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.189007 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-wtmp\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.290306 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.290278 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5675d165-27d8-4220-9ec2-40aecf150447-root\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.290306 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.290306 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kw5bn\" (UniqueName: \"kubernetes.io/projected/5675d165-27d8-4220-9ec2-40aecf150447-kube-api-access-kw5bn\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.290473 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.290350 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-accelerators-collector-config\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.290473 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.290377 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.290473 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.290393 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5675d165-27d8-4220-9ec2-40aecf150447-root\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.290473 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.290399 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5675d165-27d8-4220-9ec2-40aecf150447-metrics-client-ca\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.290473 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.290456 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-textfile\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.290736 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.290515 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5675d165-27d8-4220-9ec2-40aecf150447-sys\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.290736 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.290555 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-tls\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.290736 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.290582 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-wtmp\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.290736 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:00:21.290709 2579 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 20:00:21.290736 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.290725 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-wtmp\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.290966 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:00:21.290791 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-tls podName:5675d165-27d8-4220-9ec2-40aecf150447 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:21.790769185 +0000 UTC m=+127.745131971 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-tls") pod "node-exporter-sk4xd" (UID: "5675d165-27d8-4220-9ec2-40aecf150447") : secret "node-exporter-tls" not found
Apr 22 20:00:21.290966 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.290795 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-textfile\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.291099 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.290966 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5675d165-27d8-4220-9ec2-40aecf150447-metrics-client-ca\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.291099 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.291065 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-accelerators-collector-config\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.291223 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.291169 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5675d165-27d8-4220-9ec2-40aecf150447-sys\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.293239 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.293218 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.299392 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.299367 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw5bn\" (UniqueName: \"kubernetes.io/projected/5675d165-27d8-4220-9ec2-40aecf150447-kube-api-access-kw5bn\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.793392 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:21.793350 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-tls\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:21.793559 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:00:21.793474 2579 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 20:00:21.793559 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:00:21.793545 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-tls podName:5675d165-27d8-4220-9ec2-40aecf150447 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:22.793528099 +0000 UTC m=+128.747890878 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-tls") pod "node-exporter-sk4xd" (UID: "5675d165-27d8-4220-9ec2-40aecf150447") : secret "node-exporter-tls" not found
Apr 22 20:00:22.801405 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:22.801371 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-tls\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:22.803879 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:22.803856 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5675d165-27d8-4220-9ec2-40aecf150447-node-exporter-tls\") pod \"node-exporter-sk4xd\" (UID: \"5675d165-27d8-4220-9ec2-40aecf150447\") " pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:22.920280 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:22.920251 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-sk4xd"
Apr 22 20:00:22.928464 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:00:22.928437 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5675d165_27d8_4220_9ec2_40aecf150447.slice/crio-e17ef69f3c9832bf11cd749a93b872e01ccd7e88fc04f8107d4b297f382efec5 WatchSource:0}: Error finding container e17ef69f3c9832bf11cd749a93b872e01ccd7e88fc04f8107d4b297f382efec5: Status 404 returned error can't find the container with id e17ef69f3c9832bf11cd749a93b872e01ccd7e88fc04f8107d4b297f382efec5
Apr 22 20:00:23.142694 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:23.142657 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sk4xd" event={"ID":"5675d165-27d8-4220-9ec2-40aecf150447","Type":"ContainerStarted","Data":"e17ef69f3c9832bf11cd749a93b872e01ccd7e88fc04f8107d4b297f382efec5"}
Apr 22 20:00:24.148008 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:24.147974 2579 generic.go:358] "Generic (PLEG): container finished" podID="5675d165-27d8-4220-9ec2-40aecf150447" containerID="dcb5cb6ae93e50adf018c82890e6b8c03ce0378105fb84a7aa62b8d4990a0025" exitCode=0
Apr 22 20:00:24.148364 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:24.148035 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sk4xd" event={"ID":"5675d165-27d8-4220-9ec2-40aecf150447","Type":"ContainerDied","Data":"dcb5cb6ae93e50adf018c82890e6b8c03ce0378105fb84a7aa62b8d4990a0025"}
Apr 22 20:00:24.411529 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:24.411457 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs\") pod \"network-metrics-daemon-p2gjc\" (UID: \"56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728\") " pod="openshift-multus/network-metrics-daemon-p2gjc"
Apr 22 20:00:24.414344 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:24.414321 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728-metrics-certs\") pod \"network-metrics-daemon-p2gjc\" (UID: \"56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728\") " pod="openshift-multus/network-metrics-daemon-p2gjc"
Apr 22 20:00:24.630239 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:24.630215 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vnzkf\""
Apr 22 20:00:24.638944 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:24.638927 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2gjc"
Apr 22 20:00:24.757721 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:24.757684 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-p2gjc"]
Apr 22 20:00:24.760207 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:00:24.760178 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56cb62eb_ff2c_4ce8_bbe3_bc6c8dcad728.slice/crio-cdeeff7a93f3197fdd3c47f2266e18cb49c8058136eaa1ffdc74853c7c921f40 WatchSource:0}: Error finding container cdeeff7a93f3197fdd3c47f2266e18cb49c8058136eaa1ffdc74853c7c921f40: Status 404 returned error can't find the container with id cdeeff7a93f3197fdd3c47f2266e18cb49c8058136eaa1ffdc74853c7c921f40
Apr 22 20:00:25.152563 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:25.152523 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p2gjc" event={"ID":"56cb62eb-ff2c-4ce8-bbe3-bc6c8dcad728","Type":"ContainerStarted","Data":"cdeeff7a93f3197fdd3c47f2266e18cb49c8058136eaa1ffdc74853c7c921f40"}
Apr 22 20:00:25.154492 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:25.154456 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sk4xd" event={"ID":"5675d165-27d8-4220-9ec2-40aecf150447","Type":"ContainerStarted","Data":"364e561919215c0a508266c761e4555042cb8c2e7ad16bd9bcf34e6fc7f4ecd3"}
Apr 22 20:00:25.154492 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:25.154493 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sk4xd" event={"ID":"5675d165-27d8-4220-9ec2-40aecf150447","Type":"ContainerStarted","Data":"4b9086bf9d149e65ba010cf7a7c9795193ebdd5ffa98fe18555f507655a32787"}
Apr 22 20:00:25.180869 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:25.180821 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-sk4xd" podStartSLOduration=3.400604399 podStartE2EDuration="4.180803621s" podCreationTimestamp="2026-04-22 20:00:21 +0000 UTC" firstStartedPulling="2026-04-22 20:00:22.930382072 +0000 UTC m=+128.884744851" lastFinishedPulling="2026-04-22 20:00:23.710581294 +0000 UTC m=+129.664944073" observedRunningTime="2026-04-22 20:00:25.179413263 +0000 UTC m=+131.133776064" watchObservedRunningTime="2026-04-22 20:00:25.180803621 +0000 UTC m=+131.135166424"
Apr 22 20:00:25.841198 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:25.841093 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" podUID="70e580ec-d1ea-46db-8b30-9c2712ac7d32" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
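[Editor's note] The prober.go:120 entry above is the only unhealthy signal in this window: the service-proxy container of cluster-proxy-proxy-agent answered its liveness endpoint with HTTP 500. It is a single failure, and nothing in this log shows the container being restarted afterwards. What the kubelet's HTTP prober does reduces to a GET whose status must fall in the 2xx/3xx range; a sketch, with the URL being a placeholder rather than the pod's real probe endpoint:

package main

import (
	"fmt"
	"net/http"
)

// probe mimics an HTTP liveness/readiness check: 2xx and 3xx statuses pass,
// anything else fails with the same wording the kubelet logs above.
func probe(url string) error {
	resp, err := http.Get(url)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return nil
	}
	return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
}

func main() {
	fmt.Println(probe("http://127.0.0.1:8080/healthz")) // placeholder endpoint
}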
ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:26.177836 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-p2gjc" podStartSLOduration=131.268126865 podStartE2EDuration="2m12.177823689s" podCreationTimestamp="2026-04-22 19:58:14 +0000 UTC" firstStartedPulling="2026-04-22 20:00:24.762121814 +0000 UTC m=+130.716484593" lastFinishedPulling="2026-04-22 20:00:25.671818638 +0000 UTC m=+131.626181417" observedRunningTime="2026-04-22 20:00:26.177403985 +0000 UTC m=+132.131766787" watchObservedRunningTime="2026-04-22 20:00:26.177823689 +0000 UTC m=+132.132186487" Apr 22 20:00:30.313581 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:30.313545 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-dffdb89c7-6k4v2"] Apr 22 20:00:30.314034 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:00:30.313743 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" podUID="5f7a6a88-1728-49a9-a02a-3f66d4a19934" Apr 22 20:00:31.173629 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.173600 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 20:00:31.177898 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.177877 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 20:00:31.265680 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.265644 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5f7a6a88-1728-49a9-a02a-3f66d4a19934-image-registry-private-configuration\") pod \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " Apr 22 20:00:31.265832 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.265704 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nx6s\" (UniqueName: \"kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-kube-api-access-5nx6s\") pod \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " Apr 22 20:00:31.265832 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.265731 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5f7a6a88-1728-49a9-a02a-3f66d4a19934-installation-pull-secrets\") pod \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " Apr 22 20:00:31.265832 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.265752 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5f7a6a88-1728-49a9-a02a-3f66d4a19934-ca-trust-extracted\") pod \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " Apr 22 20:00:31.265832 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.265772 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-bound-sa-token\") pod \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\" (UID: 
\"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " Apr 22 20:00:31.265832 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.265790 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-certificates\") pod \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " Apr 22 20:00:31.265832 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.265823 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f7a6a88-1728-49a9-a02a-3f66d4a19934-trusted-ca\") pod \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\" (UID: \"5f7a6a88-1728-49a9-a02a-3f66d4a19934\") " Apr 22 20:00:31.266348 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.266175 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f7a6a88-1728-49a9-a02a-3f66d4a19934-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5f7a6a88-1728-49a9-a02a-3f66d4a19934" (UID: "5f7a6a88-1728-49a9-a02a-3f66d4a19934"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:00:31.266348 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.266283 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5f7a6a88-1728-49a9-a02a-3f66d4a19934" (UID: "5f7a6a88-1728-49a9-a02a-3f66d4a19934"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:00:31.266666 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.266450 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f7a6a88-1728-49a9-a02a-3f66d4a19934-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5f7a6a88-1728-49a9-a02a-3f66d4a19934" (UID: "5f7a6a88-1728-49a9-a02a-3f66d4a19934"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:00:31.268190 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.268161 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f7a6a88-1728-49a9-a02a-3f66d4a19934-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "5f7a6a88-1728-49a9-a02a-3f66d4a19934" (UID: "5f7a6a88-1728-49a9-a02a-3f66d4a19934"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:00:31.268295 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.268236 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5f7a6a88-1728-49a9-a02a-3f66d4a19934" (UID: "5f7a6a88-1728-49a9-a02a-3f66d4a19934"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:00:31.268295 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.268274 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f7a6a88-1728-49a9-a02a-3f66d4a19934-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5f7a6a88-1728-49a9-a02a-3f66d4a19934" (UID: "5f7a6a88-1728-49a9-a02a-3f66d4a19934"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:00:31.268369 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.268323 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-kube-api-access-5nx6s" (OuterVolumeSpecName: "kube-api-access-5nx6s") pod "5f7a6a88-1728-49a9-a02a-3f66d4a19934" (UID: "5f7a6a88-1728-49a9-a02a-3f66d4a19934"). InnerVolumeSpecName "kube-api-access-5nx6s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:00:31.366734 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.366709 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5nx6s\" (UniqueName: \"kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-kube-api-access-5nx6s\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:00:31.367064 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.366730 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5f7a6a88-1728-49a9-a02a-3f66d4a19934-installation-pull-secrets\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:00:31.367064 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.367018 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5f7a6a88-1728-49a9-a02a-3f66d4a19934-ca-trust-extracted\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:00:31.367064 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.367047 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-bound-sa-token\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:00:31.367064 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.367064 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-certificates\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:00:31.367288 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.367089 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f7a6a88-1728-49a9-a02a-3f66d4a19934-trusted-ca\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:00:31.367288 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:31.367106 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5f7a6a88-1728-49a9-a02a-3f66d4a19934-image-registry-private-configuration\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:00:32.175643 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:32.175610 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-dffdb89c7-6k4v2" Apr 22 20:00:32.207162 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:32.207121 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-dffdb89c7-6k4v2"] Apr 22 20:00:32.210453 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:32.210425 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-dffdb89c7-6k4v2"] Apr 22 20:00:32.273495 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:32.273474 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f7a6a88-1728-49a9-a02a-3f66d4a19934-registry-tls\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:00:32.692552 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:32.692523 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f7a6a88-1728-49a9-a02a-3f66d4a19934" path="/var/lib/kubelet/pods/5f7a6a88-1728-49a9-a02a-3f66d4a19934/volumes" Apr 22 20:00:35.840392 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:35.840356 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" podUID="70e580ec-d1ea-46db-8b30-9c2712ac7d32" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 20:00:45.840360 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:45.840317 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" podUID="70e580ec-d1ea-46db-8b30-9c2712ac7d32" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 20:00:45.840813 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:45.840389 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" Apr 22 20:00:45.840974 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:45.840930 2579 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"e8ae179fe8ec14bcc380c5f8aac76c31aa2ace31e86161ddd9344bc4eccbad8c"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 20:00:45.841049 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:45.841003 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" podUID="70e580ec-d1ea-46db-8b30-9c2712ac7d32" containerName="service-proxy" containerID="cri-o://e8ae179fe8ec14bcc380c5f8aac76c31aa2ace31e86161ddd9344bc4eccbad8c" gracePeriod=30 Apr 22 20:00:46.212576 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:46.212544 2579 generic.go:358] "Generic (PLEG): container finished" podID="70e580ec-d1ea-46db-8b30-9c2712ac7d32" containerID="e8ae179fe8ec14bcc380c5f8aac76c31aa2ace31e86161ddd9344bc4eccbad8c" exitCode=2 Apr 22 20:00:46.212725 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:46.212616 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" event={"ID":"70e580ec-d1ea-46db-8b30-9c2712ac7d32","Type":"ContainerDied","Data":"e8ae179fe8ec14bcc380c5f8aac76c31aa2ace31e86161ddd9344bc4eccbad8c"} 
Apr 22 20:00:46.212725 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:46.212652 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85d87b75ff-whtl8" event={"ID":"70e580ec-d1ea-46db-8b30-9c2712ac7d32","Type":"ContainerStarted","Data":"f1c527f43dc8f0621a38372ab930c7d76c07e3837787ac6d30a135f29d26ca42"}
Apr 22 20:00:50.457827 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:00:50.457788 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-4sbb4" podUID="7c89560e-578e-4ca8-bb28-9608c190c546"
Apr 22 20:00:50.474212 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:00:50.474192 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-wm9mw" podUID="ea808e89-1697-4235-8c42-8202cc97fef9"
Apr 22 20:00:51.224666 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:51.224637 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4sbb4"
Apr 22 20:00:55.437089 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:55.437046 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert\") pod \"ingress-canary-wm9mw\" (UID: \"ea808e89-1697-4235-8c42-8202cc97fef9\") " pod="openshift-ingress-canary/ingress-canary-wm9mw"
Apr 22 20:00:55.437089 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:55.437091 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls\") pod \"dns-default-4sbb4\" (UID: \"7c89560e-578e-4ca8-bb28-9608c190c546\") " pod="openshift-dns/dns-default-4sbb4"
Apr 22 20:00:55.439566 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:55.439540 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c89560e-578e-4ca8-bb28-9608c190c546-metrics-tls\") pod \"dns-default-4sbb4\" (UID: \"7c89560e-578e-4ca8-bb28-9608c190c546\") " pod="openshift-dns/dns-default-4sbb4"
Apr 22 20:00:55.439678 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:55.439598 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea808e89-1697-4235-8c42-8202cc97fef9-cert\") pod \"ingress-canary-wm9mw\" (UID: \"ea808e89-1697-4235-8c42-8202cc97fef9\") " pod="openshift-ingress-canary/ingress-canary-wm9mw"
Apr 22 20:00:55.727402 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:55.727323 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2xtr6\""
Apr 22 20:00:55.735958 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:55.735937 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4sbb4"
Apr 22 20:00:55.855028 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:55.855007 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4sbb4"]
Apr 22 20:00:55.857331 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:00:55.857299 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c89560e_578e_4ca8_bb28_9608c190c546.slice/crio-6c84dfa9acb661cce5adcc2752deb96be4967045c11142484ebe9f3e8f838aa9 WatchSource:0}: Error finding container 6c84dfa9acb661cce5adcc2752deb96be4967045c11142484ebe9f3e8f838aa9: Status 404 returned error can't find the container with id 6c84dfa9acb661cce5adcc2752deb96be4967045c11142484ebe9f3e8f838aa9
Apr 22 20:00:56.238071 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:56.238030 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4sbb4" event={"ID":"7c89560e-578e-4ca8-bb28-9608c190c546","Type":"ContainerStarted","Data":"6c84dfa9acb661cce5adcc2752deb96be4967045c11142484ebe9f3e8f838aa9"}
Apr 22 20:00:58.243926 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:58.243891 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4sbb4" event={"ID":"7c89560e-578e-4ca8-bb28-9608c190c546","Type":"ContainerStarted","Data":"647c4dd5975f259ee18134156760864f042fdd959442006f0b8d374dbbfdd701"}
Apr 22 20:00:58.243926 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:58.243928 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4sbb4" event={"ID":"7c89560e-578e-4ca8-bb28-9608c190c546","Type":"ContainerStarted","Data":"f9fbb655a29ebc6f33625d1aec5aa648822162b83444a7edaec02fe2dfeb14a9"}
Apr 22 20:00:58.244368 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:58.244026 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-4sbb4"
Apr 22 20:00:58.259696 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:00:58.259647 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4sbb4" podStartSLOduration=129.8963141 podStartE2EDuration="2m11.25962982s" podCreationTimestamp="2026-04-22 19:58:47 +0000 UTC" firstStartedPulling="2026-04-22 20:00:55.859089957 +0000 UTC m=+161.813452736" lastFinishedPulling="2026-04-22 20:00:57.222405675 +0000 UTC m=+163.176768456" observedRunningTime="2026-04-22 20:00:58.258698184 +0000 UTC m=+164.213060985" watchObservedRunningTime="2026-04-22 20:00:58.25962982 +0000 UTC m=+164.213992622"
Apr 22 20:01:05.688657 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:01:05.688586 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wm9mw"
Apr 22 20:01:05.691078 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:01:05.691059 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-g9zd7\""
Apr 22 20:01:05.699867 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:01:05.699849 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wm9mw"
Apr 22 20:01:05.811179 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:01:05.811133 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wm9mw"]
Apr 22 20:01:05.815288 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:01:05.815261 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea808e89_1697_4235_8c42_8202cc97fef9.slice/crio-98c0249cd6aa3c2239417154a05b46130239057aec6784d6baf614ba74a14756 WatchSource:0}: Error finding container 98c0249cd6aa3c2239417154a05b46130239057aec6784d6baf614ba74a14756: Status 404 returned error can't find the container with id 98c0249cd6aa3c2239417154a05b46130239057aec6784d6baf614ba74a14756
Apr 22 20:01:06.265652 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:01:06.265602 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wm9mw" event={"ID":"ea808e89-1697-4235-8c42-8202cc97fef9","Type":"ContainerStarted","Data":"98c0249cd6aa3c2239417154a05b46130239057aec6784d6baf614ba74a14756"}
Apr 22 20:01:08.249349 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:01:08.249315 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4sbb4"
Apr 22 20:01:08.271780 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:01:08.271747 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wm9mw" event={"ID":"ea808e89-1697-4235-8c42-8202cc97fef9","Type":"ContainerStarted","Data":"675f684984d6c401a76ca116b4a649c21e08622d010e43950193d3237b427932"}
Apr 22 20:01:08.324433 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:01:08.324388 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wm9mw" podStartSLOduration=139.548049901 podStartE2EDuration="2m21.324375047s" podCreationTimestamp="2026-04-22 19:58:47 +0000 UTC" firstStartedPulling="2026-04-22 20:01:05.817196125 +0000 UTC m=+171.771558904" lastFinishedPulling="2026-04-22 20:01:07.593521267 +0000 UTC m=+173.547884050" observedRunningTime="2026-04-22 20:01:08.322851642 +0000 UTC m=+174.277214442" watchObservedRunningTime="2026-04-22 20:01:08.324375047 +0000 UTC m=+174.278737847"
Apr 22 20:03:14.556628 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:14.556604 2579 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 20:03:20.375298 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:20.375259 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-cbtjp"]
Apr 22 20:03:20.377324 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:20.377305 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-cbtjp"
Apr 22 20:03:20.379583 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:20.379549 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 22 20:03:20.379583 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:20.379572 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 22 20:03:20.379583 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:20.379555 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 22 20:03:20.379765 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:20.379593 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-7rtsm\""
Apr 22 20:03:20.379765 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:20.379595 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 22 20:03:20.380239 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:20.380226 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 22 20:03:20.388604 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:20.388582 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-cbtjp"]
Apr 22 20:03:20.476000 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:20.475982 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/04221c1b-c57f-4b18-b4ff-1f4238bd646a-cabundle0\") pod \"keda-operator-ffbb595cb-cbtjp\" (UID: \"04221c1b-c57f-4b18-b4ff-1f4238bd646a\") " pod="openshift-keda/keda-operator-ffbb595cb-cbtjp"
Apr 22 20:03:20.476097 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:20.476021 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55xkm\" (UniqueName: \"kubernetes.io/projected/04221c1b-c57f-4b18-b4ff-1f4238bd646a-kube-api-access-55xkm\") pod \"keda-operator-ffbb595cb-cbtjp\" (UID: \"04221c1b-c57f-4b18-b4ff-1f4238bd646a\") " pod="openshift-keda/keda-operator-ffbb595cb-cbtjp"
Apr 22 20:03:20.476097 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:20.476052 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/04221c1b-c57f-4b18-b4ff-1f4238bd646a-certificates\") pod \"keda-operator-ffbb595cb-cbtjp\" (UID: \"04221c1b-c57f-4b18-b4ff-1f4238bd646a\") " pod="openshift-keda/keda-operator-ffbb595cb-cbtjp"
Apr 22 20:03:20.576716 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:20.576692 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55xkm\" (UniqueName: \"kubernetes.io/projected/04221c1b-c57f-4b18-b4ff-1f4238bd646a-kube-api-access-55xkm\") pod \"keda-operator-ffbb595cb-cbtjp\" (UID: \"04221c1b-c57f-4b18-b4ff-1f4238bd646a\") " pod="openshift-keda/keda-operator-ffbb595cb-cbtjp"
Apr 22 20:03:20.576798 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:20.576725 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/04221c1b-c57f-4b18-b4ff-1f4238bd646a-certificates\") pod \"keda-operator-ffbb595cb-cbtjp\" (UID: \"04221c1b-c57f-4b18-b4ff-1f4238bd646a\") " pod="openshift-keda/keda-operator-ffbb595cb-cbtjp"
Apr 22 20:03:20.576798 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:20.576763 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/04221c1b-c57f-4b18-b4ff-1f4238bd646a-cabundle0\") pod \"keda-operator-ffbb595cb-cbtjp\" (UID: \"04221c1b-c57f-4b18-b4ff-1f4238bd646a\") " pod="openshift-keda/keda-operator-ffbb595cb-cbtjp"
Apr 22 20:03:20.576873 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:03:20.576845 2579 secret.go:281] references non-existent secret key: ca.crt
Apr 22 20:03:20.576873 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:03:20.576862 2579 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 20:03:20.576934 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:03:20.576873 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-cbtjp: references non-existent secret key: ca.crt
Apr 22 20:03:20.576934 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:03:20.576930 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04221c1b-c57f-4b18-b4ff-1f4238bd646a-certificates podName:04221c1b-c57f-4b18-b4ff-1f4238bd646a nodeName:}" failed. No retries permitted until 2026-04-22 20:03:21.076910704 +0000 UTC m=+307.031273488 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/04221c1b-c57f-4b18-b4ff-1f4238bd646a-certificates") pod "keda-operator-ffbb595cb-cbtjp" (UID: "04221c1b-c57f-4b18-b4ff-1f4238bd646a") : references non-existent secret key: ca.crt
Apr 22 20:03:20.577451 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:20.577433 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/04221c1b-c57f-4b18-b4ff-1f4238bd646a-cabundle0\") pod \"keda-operator-ffbb595cb-cbtjp\" (UID: \"04221c1b-c57f-4b18-b4ff-1f4238bd646a\") " pod="openshift-keda/keda-operator-ffbb595cb-cbtjp"
Apr 22 20:03:20.584864 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:20.584845 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55xkm\" (UniqueName: \"kubernetes.io/projected/04221c1b-c57f-4b18-b4ff-1f4238bd646a-kube-api-access-55xkm\") pod \"keda-operator-ffbb595cb-cbtjp\" (UID: \"04221c1b-c57f-4b18-b4ff-1f4238bd646a\") " pod="openshift-keda/keda-operator-ffbb595cb-cbtjp"
Apr 22 20:03:21.042247 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:21.042212 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-4w7gm"]
Apr 22 20:03:21.044215 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:21.044195 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-4w7gm"
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-4w7gm" Apr 22 20:03:21.046244 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:21.046222 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 22 20:03:21.057490 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:21.057467 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-4w7gm"] Apr 22 20:03:21.079685 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:21.079648 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/04221c1b-c57f-4b18-b4ff-1f4238bd646a-certificates\") pod \"keda-operator-ffbb595cb-cbtjp\" (UID: \"04221c1b-c57f-4b18-b4ff-1f4238bd646a\") " pod="openshift-keda/keda-operator-ffbb595cb-cbtjp" Apr 22 20:03:21.079809 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:03:21.079794 2579 secret.go:281] references non-existent secret key: ca.crt Apr 22 20:03:21.079858 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:03:21.079813 2579 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 20:03:21.079858 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:03:21.079821 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-cbtjp: references non-existent secret key: ca.crt Apr 22 20:03:21.079916 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:03:21.079869 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04221c1b-c57f-4b18-b4ff-1f4238bd646a-certificates podName:04221c1b-c57f-4b18-b4ff-1f4238bd646a nodeName:}" failed. No retries permitted until 2026-04-22 20:03:22.079852036 +0000 UTC m=+308.034214825 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/04221c1b-c57f-4b18-b4ff-1f4238bd646a-certificates") pod "keda-operator-ffbb595cb-cbtjp" (UID: "04221c1b-c57f-4b18-b4ff-1f4238bd646a") : references non-existent secret key: ca.crt Apr 22 20:03:21.180472 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:21.180440 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d651081f-8028-484f-9d19-d90b3f8e13f8-certificates\") pod \"keda-admission-cf49989db-4w7gm\" (UID: \"d651081f-8028-484f-9d19-d90b3f8e13f8\") " pod="openshift-keda/keda-admission-cf49989db-4w7gm" Apr 22 20:03:21.180590 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:21.180541 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v44gm\" (UniqueName: \"kubernetes.io/projected/d651081f-8028-484f-9d19-d90b3f8e13f8-kube-api-access-v44gm\") pod \"keda-admission-cf49989db-4w7gm\" (UID: \"d651081f-8028-484f-9d19-d90b3f8e13f8\") " pod="openshift-keda/keda-admission-cf49989db-4w7gm" Apr 22 20:03:21.281553 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:21.281525 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v44gm\" (UniqueName: \"kubernetes.io/projected/d651081f-8028-484f-9d19-d90b3f8e13f8-kube-api-access-v44gm\") pod \"keda-admission-cf49989db-4w7gm\" (UID: \"d651081f-8028-484f-9d19-d90b3f8e13f8\") " pod="openshift-keda/keda-admission-cf49989db-4w7gm" Apr 22 20:03:21.281639 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:21.281566 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d651081f-8028-484f-9d19-d90b3f8e13f8-certificates\") pod \"keda-admission-cf49989db-4w7gm\" (UID: \"d651081f-8028-484f-9d19-d90b3f8e13f8\") " pod="openshift-keda/keda-admission-cf49989db-4w7gm" Apr 22 20:03:21.281676 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:03:21.281658 2579 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 22 20:03:21.281676 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:03:21.281673 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-4w7gm: secret "keda-admission-webhooks-certs" not found Apr 22 20:03:21.281740 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:03:21.281717 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d651081f-8028-484f-9d19-d90b3f8e13f8-certificates podName:d651081f-8028-484f-9d19-d90b3f8e13f8 nodeName:}" failed. No retries permitted until 2026-04-22 20:03:21.781704671 +0000 UTC m=+307.736067451 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d651081f-8028-484f-9d19-d90b3f8e13f8-certificates") pod "keda-admission-cf49989db-4w7gm" (UID: "d651081f-8028-484f-9d19-d90b3f8e13f8") : secret "keda-admission-webhooks-certs" not found Apr 22 20:03:21.291516 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:21.291488 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v44gm\" (UniqueName: \"kubernetes.io/projected/d651081f-8028-484f-9d19-d90b3f8e13f8-kube-api-access-v44gm\") pod \"keda-admission-cf49989db-4w7gm\" (UID: \"d651081f-8028-484f-9d19-d90b3f8e13f8\") " pod="openshift-keda/keda-admission-cf49989db-4w7gm" Apr 22 20:03:21.785295 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:21.785267 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d651081f-8028-484f-9d19-d90b3f8e13f8-certificates\") pod \"keda-admission-cf49989db-4w7gm\" (UID: \"d651081f-8028-484f-9d19-d90b3f8e13f8\") " pod="openshift-keda/keda-admission-cf49989db-4w7gm" Apr 22 20:03:21.787703 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:21.787680 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d651081f-8028-484f-9d19-d90b3f8e13f8-certificates\") pod \"keda-admission-cf49989db-4w7gm\" (UID: \"d651081f-8028-484f-9d19-d90b3f8e13f8\") " pod="openshift-keda/keda-admission-cf49989db-4w7gm" Apr 22 20:03:21.954005 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:21.953977 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-4w7gm" Apr 22 20:03:22.066284 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:22.066203 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-4w7gm"] Apr 22 20:03:22.069201 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:03:22.069174 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd651081f_8028_484f_9d19_d90b3f8e13f8.slice/crio-974f39867c53144bc2be9d9381cdbcfabf1cce9c79cc9f66915f8119f75ee93e WatchSource:0}: Error finding container 974f39867c53144bc2be9d9381cdbcfabf1cce9c79cc9f66915f8119f75ee93e: Status 404 returned error can't find the container with id 974f39867c53144bc2be9d9381cdbcfabf1cce9c79cc9f66915f8119f75ee93e Apr 22 20:03:22.070428 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:22.070410 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:03:22.088402 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:22.088382 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/04221c1b-c57f-4b18-b4ff-1f4238bd646a-certificates\") pod \"keda-operator-ffbb595cb-cbtjp\" (UID: \"04221c1b-c57f-4b18-b4ff-1f4238bd646a\") " pod="openshift-keda/keda-operator-ffbb595cb-cbtjp" Apr 22 20:03:22.088523 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:03:22.088508 2579 secret.go:281] references non-existent secret key: ca.crt Apr 22 20:03:22.088565 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:03:22.088527 2579 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 20:03:22.088565 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:03:22.088536 2579 projected.go:194] Error preparing data for projected 
volume certificates for pod openshift-keda/keda-operator-ffbb595cb-cbtjp: references non-existent secret key: ca.crt Apr 22 20:03:22.088626 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:03:22.088591 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04221c1b-c57f-4b18-b4ff-1f4238bd646a-certificates podName:04221c1b-c57f-4b18-b4ff-1f4238bd646a nodeName:}" failed. No retries permitted until 2026-04-22 20:03:24.088576155 +0000 UTC m=+310.042938934 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/04221c1b-c57f-4b18-b4ff-1f4238bd646a-certificates") pod "keda-operator-ffbb595cb-cbtjp" (UID: "04221c1b-c57f-4b18-b4ff-1f4238bd646a") : references non-existent secret key: ca.crt Apr 22 20:03:22.610651 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:22.610615 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-4w7gm" event={"ID":"d651081f-8028-484f-9d19-d90b3f8e13f8","Type":"ContainerStarted","Data":"974f39867c53144bc2be9d9381cdbcfabf1cce9c79cc9f66915f8119f75ee93e"} Apr 22 20:03:24.104234 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:24.104198 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/04221c1b-c57f-4b18-b4ff-1f4238bd646a-certificates\") pod \"keda-operator-ffbb595cb-cbtjp\" (UID: \"04221c1b-c57f-4b18-b4ff-1f4238bd646a\") " pod="openshift-keda/keda-operator-ffbb595cb-cbtjp" Apr 22 20:03:24.104629 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:03:24.104329 2579 secret.go:281] references non-existent secret key: ca.crt Apr 22 20:03:24.104629 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:03:24.104342 2579 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 20:03:24.104629 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:03:24.104352 2579 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-cbtjp: references non-existent secret key: ca.crt Apr 22 20:03:24.104629 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:03:24.104405 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04221c1b-c57f-4b18-b4ff-1f4238bd646a-certificates podName:04221c1b-c57f-4b18-b4ff-1f4238bd646a nodeName:}" failed. No retries permitted until 2026-04-22 20:03:28.10438751 +0000 UTC m=+314.058750289 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/04221c1b-c57f-4b18-b4ff-1f4238bd646a-certificates") pod "keda-operator-ffbb595cb-cbtjp" (UID: "04221c1b-c57f-4b18-b4ff-1f4238bd646a") : references non-existent secret key: ca.crt Apr 22 20:03:24.618242 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:24.618207 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-4w7gm" event={"ID":"d651081f-8028-484f-9d19-d90b3f8e13f8","Type":"ContainerStarted","Data":"a24d45c41ca914b2bfa6696261416ad744281635abb2d3ecad74155a0829d3a3"} Apr 22 20:03:24.618384 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:24.618267 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-4w7gm" Apr 22 20:03:24.633045 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:24.632962 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-4w7gm" podStartSLOduration=1.321341307 podStartE2EDuration="3.632948409s" podCreationTimestamp="2026-04-22 20:03:21 +0000 UTC" firstStartedPulling="2026-04-22 20:03:22.070607389 +0000 UTC m=+308.024970172" lastFinishedPulling="2026-04-22 20:03:24.382214495 +0000 UTC m=+310.336577274" observedRunningTime="2026-04-22 20:03:24.632005203 +0000 UTC m=+310.586368004" watchObservedRunningTime="2026-04-22 20:03:24.632948409 +0000 UTC m=+310.587311210" Apr 22 20:03:28.130248 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:28.130216 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/04221c1b-c57f-4b18-b4ff-1f4238bd646a-certificates\") pod \"keda-operator-ffbb595cb-cbtjp\" (UID: \"04221c1b-c57f-4b18-b4ff-1f4238bd646a\") " pod="openshift-keda/keda-operator-ffbb595cb-cbtjp" Apr 22 20:03:28.132711 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:28.132692 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/04221c1b-c57f-4b18-b4ff-1f4238bd646a-certificates\") pod \"keda-operator-ffbb595cb-cbtjp\" (UID: \"04221c1b-c57f-4b18-b4ff-1f4238bd646a\") " pod="openshift-keda/keda-operator-ffbb595cb-cbtjp" Apr 22 20:03:28.186735 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:28.186695 2579 util.go:30] "No sandbox for pod can be found. 
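Note: the failing certificates mount for keda-operator is retried under the kubelet's per-operation exponential backoff: durationBeforeRetry goes 500ms (20:03:20.576930), 1s (20:03:21.079869), 2s (20:03:22.088591), then 4s (20:03:24.104405), and the attempt at 20:03:28 succeeds once the ca.crt key exists. A minimal Go sketch of that doubling pattern (an illustration only, not the nestedpendingoperations implementation; the initial delay matches the log, the cap is an assumption):

// retryWithBackoff retries op, doubling the wait after every failure,
// mirroring the durationBeforeRetry progression in the entries above.
package main

import (
	"errors"
	"fmt"
	"time"
)

func retryWithBackoff(op func() error, initial, maxDelay time.Duration) {
	delay := initial
	for {
		if err := op(); err == nil {
			return
		} else {
			fmt.Printf("failed: %v; no retries permitted for %v\n", err, delay)
		}
		time.Sleep(delay)
		if delay < maxDelay {
			delay *= 2 // 500ms -> 1s -> 2s -> 4s -> ...
		}
	}
}

func main() {
	attempts := 0
	retryWithBackoff(func() error {
		attempts++
		if attempts < 5 {
			return errors.New("references non-existent secret key: ca.crt")
		}
		return nil // the key finally exists, like the 20:03:28 SetUp success
	}, 500*time.Millisecond, 2*time.Minute)
}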
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-cbtjp" Apr 22 20:03:28.297558 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:28.297522 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-cbtjp"] Apr 22 20:03:28.302164 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:03:28.302114 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04221c1b_c57f_4b18_b4ff_1f4238bd646a.slice/crio-0a44dbcb7cb0db87e4217f5d5af2bd6b348e00df0c76110fde5cada74a51d4b0 WatchSource:0}: Error finding container 0a44dbcb7cb0db87e4217f5d5af2bd6b348e00df0c76110fde5cada74a51d4b0: Status 404 returned error can't find the container with id 0a44dbcb7cb0db87e4217f5d5af2bd6b348e00df0c76110fde5cada74a51d4b0 Apr 22 20:03:28.629453 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:28.629421 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-cbtjp" event={"ID":"04221c1b-c57f-4b18-b4ff-1f4238bd646a","Type":"ContainerStarted","Data":"0a44dbcb7cb0db87e4217f5d5af2bd6b348e00df0c76110fde5cada74a51d4b0"} Apr 22 20:03:31.638698 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:31.638668 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-cbtjp" event={"ID":"04221c1b-c57f-4b18-b4ff-1f4238bd646a","Type":"ContainerStarted","Data":"61eb3e85491c5fe74c72830d97f1e77e94e2d5686cb7b22aa1a02bbdfa72159a"} Apr 22 20:03:31.639085 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:31.638822 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-cbtjp" Apr 22 20:03:31.654959 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:31.654922 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-cbtjp" podStartSLOduration=8.739081352 podStartE2EDuration="11.654909961s" podCreationTimestamp="2026-04-22 20:03:20 +0000 UTC" firstStartedPulling="2026-04-22 20:03:28.303652904 +0000 UTC m=+314.258015696" lastFinishedPulling="2026-04-22 20:03:31.219481509 +0000 UTC m=+317.173844305" observedRunningTime="2026-04-22 20:03:31.653414477 +0000 UTC m=+317.607777278" watchObservedRunningTime="2026-04-22 20:03:31.654909961 +0000 UTC m=+317.609272761" Apr 22 20:03:45.623095 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:45.623061 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-4w7gm" Apr 22 20:03:52.643937 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:03:52.643909 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-cbtjp" Apr 22 20:04:26.520129 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:04:26.520051 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-qk672"] Apr 22 20:04:26.527472 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:04:26.527438 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qk672" Apr 22 20:04:26.532468 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:04:26.532442 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 20:04:26.533513 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:04:26.533386 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 22 20:04:26.533513 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:04:26.533419 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 20:04:26.533513 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:04:26.533462 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-qk672"] Apr 22 20:04:26.533513 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:04:26.533464 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-w8zj8\"" Apr 22 20:04:26.615597 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:04:26.615568 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n795z\" (UniqueName: \"kubernetes.io/projected/178c3516-513b-4bdc-b570-a419f78d41ba-kube-api-access-n795z\") pod \"llmisvc-controller-manager-68cc5db7c4-qk672\" (UID: \"178c3516-513b-4bdc-b570-a419f78d41ba\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-qk672" Apr 22 20:04:26.615719 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:04:26.615626 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/178c3516-513b-4bdc-b570-a419f78d41ba-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-qk672\" (UID: \"178c3516-513b-4bdc-b570-a419f78d41ba\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-qk672" Apr 22 20:04:26.716797 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:04:26.716760 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/178c3516-513b-4bdc-b570-a419f78d41ba-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-qk672\" (UID: \"178c3516-513b-4bdc-b570-a419f78d41ba\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-qk672" Apr 22 20:04:26.716955 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:04:26.716814 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n795z\" (UniqueName: \"kubernetes.io/projected/178c3516-513b-4bdc-b570-a419f78d41ba-kube-api-access-n795z\") pod \"llmisvc-controller-manager-68cc5db7c4-qk672\" (UID: \"178c3516-513b-4bdc-b570-a419f78d41ba\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-qk672" Apr 22 20:04:26.719282 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:04:26.719260 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/178c3516-513b-4bdc-b570-a419f78d41ba-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-qk672\" (UID: \"178c3516-513b-4bdc-b570-a419f78d41ba\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-qk672" Apr 22 20:04:26.724523 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:04:26.724503 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n795z\" (UniqueName: \"kubernetes.io/projected/178c3516-513b-4bdc-b570-a419f78d41ba-kube-api-access-n795z\") pod 
\"llmisvc-controller-manager-68cc5db7c4-qk672\" (UID: \"178c3516-513b-4bdc-b570-a419f78d41ba\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-qk672" Apr 22 20:04:26.837933 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:04:26.837875 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qk672" Apr 22 20:04:26.955107 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:04:26.955076 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-qk672"] Apr 22 20:04:26.958227 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:04:26.958199 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod178c3516_513b_4bdc_b570_a419f78d41ba.slice/crio-ba63a3b138c9dfefb674a86c8f2570a2e082ebb01a90c8c3dddac4c5264263a6 WatchSource:0}: Error finding container ba63a3b138c9dfefb674a86c8f2570a2e082ebb01a90c8c3dddac4c5264263a6: Status 404 returned error can't find the container with id ba63a3b138c9dfefb674a86c8f2570a2e082ebb01a90c8c3dddac4c5264263a6 Apr 22 20:04:27.775713 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:04:27.775677 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qk672" event={"ID":"178c3516-513b-4bdc-b570-a419f78d41ba","Type":"ContainerStarted","Data":"ba63a3b138c9dfefb674a86c8f2570a2e082ebb01a90c8c3dddac4c5264263a6"} Apr 22 20:04:29.781905 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:04:29.781872 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qk672" event={"ID":"178c3516-513b-4bdc-b570-a419f78d41ba","Type":"ContainerStarted","Data":"97de6638364d625668ea8d971b3083202ce13dd086d6f7c6b5e3543795b4d5a8"} Apr 22 20:04:29.782314 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:04:29.782031 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qk672" Apr 22 20:04:29.798732 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:04:29.798686 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qk672" podStartSLOduration=1.8901551699999999 podStartE2EDuration="3.798669906s" podCreationTimestamp="2026-04-22 20:04:26 +0000 UTC" firstStartedPulling="2026-04-22 20:04:26.959306786 +0000 UTC m=+372.913669566" lastFinishedPulling="2026-04-22 20:04:28.867821522 +0000 UTC m=+374.822184302" observedRunningTime="2026-04-22 20:04:29.797898325 +0000 UTC m=+375.752261126" watchObservedRunningTime="2026-04-22 20:04:29.798669906 +0000 UTC m=+375.753032712" Apr 22 20:05:00.787024 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:05:00.786996 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-qk672" Apr 22 20:09:38.271662 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:09:38.271629 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-nk64k"] Apr 22 20:09:38.274830 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:09:38.274816 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-nk64k" Apr 22 20:09:38.276817 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:09:38.276791 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t9c5k\"" Apr 22 20:09:38.284171 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:09:38.284115 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-nk64k" Apr 22 20:09:38.284265 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:09:38.284214 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-nk64k"] Apr 22 20:09:38.406872 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:09:38.406842 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-nk64k"] Apr 22 20:09:38.409932 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:09:38.409901 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1454df49_6d25_49d8_8c20_be7b878dacc2.slice/crio-419d9b0830dace2079e4b057ad9622837a12ea38e36a5405714cb8b7951ffd83 WatchSource:0}: Error finding container 419d9b0830dace2079e4b057ad9622837a12ea38e36a5405714cb8b7951ffd83: Status 404 returned error can't find the container with id 419d9b0830dace2079e4b057ad9622837a12ea38e36a5405714cb8b7951ffd83 Apr 22 20:09:38.411711 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:09:38.411695 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:09:38.552828 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:09:38.552767 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-nk64k" event={"ID":"1454df49-6d25-49d8-8c20-be7b878dacc2","Type":"ContainerStarted","Data":"419d9b0830dace2079e4b057ad9622837a12ea38e36a5405714cb8b7951ffd83"} Apr 22 20:09:39.556301 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:09:39.556270 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-nk64k" event={"ID":"1454df49-6d25-49d8-8c20-be7b878dacc2","Type":"ContainerStarted","Data":"602ea2c160d13db5af4800aaa31e1c36b2c12026f809cdda73e3a97ee789b0b2"} Apr 22 20:09:39.556618 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:09:39.556473 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-nk64k" Apr 22 20:09:39.557822 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:09:39.557799 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-nk64k" podUID="1454df49-6d25-49d8-8c20-be7b878dacc2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.16:8080: connect: connection refused" Apr 22 20:09:39.569049 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:09:39.569000 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-nk64k" podStartSLOduration=0.588961501 podStartE2EDuration="1.568985153s" podCreationTimestamp="2026-04-22 20:09:38 +0000 UTC" firstStartedPulling="2026-04-22 20:09:38.411846459 +0000 UTC m=+684.366209238" lastFinishedPulling="2026-04-22 20:09:39.391870111 +0000 UTC m=+685.346232890" observedRunningTime="2026-04-22 20:09:39.567994807 +0000 UTC 
m=+685.522357632" watchObservedRunningTime="2026-04-22 20:09:39.568985153 +0000 UTC m=+685.523347960" Apr 22 20:09:40.560296 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:09:40.560212 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-nk64k" Apr 22 20:11:13.355969 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:11:13.355940 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-7f66cccfb6-nk64k_1454df49-6d25-49d8-8c20-be7b878dacc2/kserve-container/0.log" Apr 22 20:11:13.631945 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:11:13.631917 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-nk64k"] Apr 22 20:11:13.632177 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:11:13.632128 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-nk64k" podUID="1454df49-6d25-49d8-8c20-be7b878dacc2" containerName="kserve-container" containerID="cri-o://602ea2c160d13db5af4800aaa31e1c36b2c12026f809cdda73e3a97ee789b0b2" gracePeriod=30 Apr 22 20:11:13.796449 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:11:13.796419 2579 generic.go:358] "Generic (PLEG): container finished" podID="1454df49-6d25-49d8-8c20-be7b878dacc2" containerID="602ea2c160d13db5af4800aaa31e1c36b2c12026f809cdda73e3a97ee789b0b2" exitCode=2 Apr 22 20:11:13.796557 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:11:13.796497 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-nk64k" event={"ID":"1454df49-6d25-49d8-8c20-be7b878dacc2","Type":"ContainerDied","Data":"602ea2c160d13db5af4800aaa31e1c36b2c12026f809cdda73e3a97ee789b0b2"} Apr 22 20:11:13.855781 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:11:13.855761 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-nk64k" Apr 22 20:11:14.799579 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:11:14.799545 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-nk64k" event={"ID":"1454df49-6d25-49d8-8c20-be7b878dacc2","Type":"ContainerDied","Data":"419d9b0830dace2079e4b057ad9622837a12ea38e36a5405714cb8b7951ffd83"} Apr 22 20:11:14.799579 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:11:14.799581 2579 scope.go:117] "RemoveContainer" containerID="602ea2c160d13db5af4800aaa31e1c36b2c12026f809cdda73e3a97ee789b0b2" Apr 22 20:11:14.799998 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:11:14.799557 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-nk64k" Apr 22 20:11:14.814079 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:11:14.814059 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-nk64k"] Apr 22 20:11:14.817843 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:11:14.817823 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-nk64k"] Apr 22 20:11:16.692314 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:11:16.692274 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1454df49-6d25-49d8-8c20-be7b878dacc2" path="/var/lib/kubelet/pods/1454df49-6d25-49d8-8c20-be7b878dacc2/volumes" Apr 22 20:20:36.811758 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:20:36.811673 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv"] Apr 22 20:20:36.814110 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:20:36.812015 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1454df49-6d25-49d8-8c20-be7b878dacc2" containerName="kserve-container" Apr 22 20:20:36.814110 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:20:36.812032 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1454df49-6d25-49d8-8c20-be7b878dacc2" containerName="kserve-container" Apr 22 20:20:36.814110 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:20:36.812110 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1454df49-6d25-49d8-8c20-be7b878dacc2" containerName="kserve-container" Apr 22 20:20:36.814957 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:20:36.814938 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" Apr 22 20:20:36.816982 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:20:36.816965 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t9c5k\"" Apr 22 20:20:36.823816 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:20:36.823776 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv"] Apr 22 20:20:36.902350 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:20:36.902311 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-kqwlv\" (UID: \"c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" Apr 22 20:20:37.002996 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:20:37.002966 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-kqwlv\" (UID: \"c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" Apr 22 20:20:37.003308 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:20:37.003292 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f-kserve-provision-location\") pod 
\"isvc-paddle-runtime-predictor-86b49c4466-kqwlv\" (UID: \"c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" Apr 22 20:20:37.125468 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:20:37.125446 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" Apr 22 20:20:37.241689 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:20:37.241662 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv"] Apr 22 20:20:37.245777 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:20:37.245742 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc70a59ed_df3e_4c23_90eb_fcd3b2bcfb1f.slice/crio-05aebb6622ee492b9d730453b08ca990ad8f13f364cf10c00062495fd2df1237 WatchSource:0}: Error finding container 05aebb6622ee492b9d730453b08ca990ad8f13f364cf10c00062495fd2df1237: Status 404 returned error can't find the container with id 05aebb6622ee492b9d730453b08ca990ad8f13f364cf10c00062495fd2df1237 Apr 22 20:20:37.247879 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:20:37.247864 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:20:38.214968 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:20:38.214934 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" event={"ID":"c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f","Type":"ContainerStarted","Data":"05aebb6622ee492b9d730453b08ca990ad8f13f364cf10c00062495fd2df1237"} Apr 22 20:20:42.228671 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:20:42.228626 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" event={"ID":"c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f","Type":"ContainerStarted","Data":"2f4b777ec435349bb85cd214172d92790e51a8005a2c921805886d99755a5d81"} Apr 22 20:20:46.241023 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:20:46.240996 2579 generic.go:358] "Generic (PLEG): container finished" podID="c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f" containerID="2f4b777ec435349bb85cd214172d92790e51a8005a2c921805886d99755a5d81" exitCode=0 Apr 22 20:20:46.241328 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:20:46.241064 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" event={"ID":"c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f","Type":"ContainerDied","Data":"2f4b777ec435349bb85cd214172d92790e51a8005a2c921805886d99755a5d81"} Apr 22 20:21:00.283411 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:00.283371 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" event={"ID":"c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f","Type":"ContainerStarted","Data":"be0f427b2d15ac841ca6b22b7d1487a0d0f60490e80212965ccafe1e4123637c"} Apr 22 20:21:00.283899 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:00.283759 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" Apr 22 20:21:00.284736 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:00.284712 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" podUID="c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 22 20:21:00.298764 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:00.298722 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" podStartSLOduration=2.107949149 podStartE2EDuration="24.298710417s" podCreationTimestamp="2026-04-22 20:20:36 +0000 UTC" firstStartedPulling="2026-04-22 20:20:37.247986437 +0000 UTC m=+1343.202349217" lastFinishedPulling="2026-04-22 20:20:59.438747702 +0000 UTC m=+1365.393110485" observedRunningTime="2026-04-22 20:21:00.297725798 +0000 UTC m=+1366.252088599" watchObservedRunningTime="2026-04-22 20:21:00.298710417 +0000 UTC m=+1366.253073217" Apr 22 20:21:01.286030 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:01.285991 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" podUID="c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 22 20:21:11.286081 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:11.286045 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" podUID="c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 22 20:21:21.286515 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:21.286469 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" podUID="c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 22 20:21:31.286255 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:31.286212 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" podUID="c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 22 20:21:41.287196 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:41.287166 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" Apr 22 20:21:48.273995 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:48.273963 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv"] Apr 22 20:21:48.274467 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:48.274341 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" podUID="c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f" containerName="kserve-container" containerID="cri-o://be0f427b2d15ac841ca6b22b7d1487a0d0f60490e80212965ccafe1e4123637c" gracePeriod=30 Apr 22 20:21:50.600037 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:50.600016 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" Apr 22 20:21:50.622630 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:50.622609 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f-kserve-provision-location\") pod \"c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f\" (UID: \"c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f\") " Apr 22 20:21:50.631411 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:50.631383 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f" (UID: "c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:21:50.723288 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:50.723269 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f-kserve-provision-location\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:21:51.418301 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:51.418267 2579 generic.go:358] "Generic (PLEG): container finished" podID="c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f" containerID="be0f427b2d15ac841ca6b22b7d1487a0d0f60490e80212965ccafe1e4123637c" exitCode=0 Apr 22 20:21:51.418447 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:51.418311 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" event={"ID":"c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f","Type":"ContainerDied","Data":"be0f427b2d15ac841ca6b22b7d1487a0d0f60490e80212965ccafe1e4123637c"} Apr 22 20:21:51.418447 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:51.418333 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" Apr 22 20:21:51.418447 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:51.418349 2579 scope.go:117] "RemoveContainer" containerID="be0f427b2d15ac841ca6b22b7d1487a0d0f60490e80212965ccafe1e4123637c" Apr 22 20:21:51.418447 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:51.418338 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv" event={"ID":"c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f","Type":"ContainerDied","Data":"05aebb6622ee492b9d730453b08ca990ad8f13f364cf10c00062495fd2df1237"} Apr 22 20:21:51.429768 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:51.429748 2579 scope.go:117] "RemoveContainer" containerID="2f4b777ec435349bb85cd214172d92790e51a8005a2c921805886d99755a5d81" Apr 22 20:21:51.435678 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:51.435658 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv"] Apr 22 20:21:51.437077 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:51.437058 2579 scope.go:117] "RemoveContainer" containerID="be0f427b2d15ac841ca6b22b7d1487a0d0f60490e80212965ccafe1e4123637c" Apr 22 20:21:51.437418 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:21:51.437390 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be0f427b2d15ac841ca6b22b7d1487a0d0f60490e80212965ccafe1e4123637c\": container with ID starting with be0f427b2d15ac841ca6b22b7d1487a0d0f60490e80212965ccafe1e4123637c not found: ID does not exist" containerID="be0f427b2d15ac841ca6b22b7d1487a0d0f60490e80212965ccafe1e4123637c" Apr 22 20:21:51.437546 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:51.437522 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be0f427b2d15ac841ca6b22b7d1487a0d0f60490e80212965ccafe1e4123637c"} err="failed to get container status \"be0f427b2d15ac841ca6b22b7d1487a0d0f60490e80212965ccafe1e4123637c\": rpc error: code = NotFound desc = could not find container \"be0f427b2d15ac841ca6b22b7d1487a0d0f60490e80212965ccafe1e4123637c\": container with ID starting with be0f427b2d15ac841ca6b22b7d1487a0d0f60490e80212965ccafe1e4123637c not found: ID does not exist" Apr 22 20:21:51.437649 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:51.437634 2579 scope.go:117] "RemoveContainer" containerID="2f4b777ec435349bb85cd214172d92790e51a8005a2c921805886d99755a5d81" Apr 22 20:21:51.437926 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:21:51.437908 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f4b777ec435349bb85cd214172d92790e51a8005a2c921805886d99755a5d81\": container with ID starting with 2f4b777ec435349bb85cd214172d92790e51a8005a2c921805886d99755a5d81 not found: ID does not exist" containerID="2f4b777ec435349bb85cd214172d92790e51a8005a2c921805886d99755a5d81" Apr 22 20:21:51.437998 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:51.437936 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f4b777ec435349bb85cd214172d92790e51a8005a2c921805886d99755a5d81"} err="failed to get container status \"2f4b777ec435349bb85cd214172d92790e51a8005a2c921805886d99755a5d81\": rpc error: code = NotFound desc = could not find container \"2f4b777ec435349bb85cd214172d92790e51a8005a2c921805886d99755a5d81\": container with ID starting with 
2f4b777ec435349bb85cd214172d92790e51a8005a2c921805886d99755a5d81 not found: ID does not exist" Apr 22 20:21:51.438600 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:51.438585 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-kqwlv"] Apr 22 20:21:52.692158 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:21:52.692112 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f" path="/var/lib/kubelet/pods/c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f/volumes" Apr 22 20:22:40.201127 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:40.201090 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz"] Apr 22 20:22:40.201561 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:40.201370 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f" containerName="storage-initializer" Apr 22 20:22:40.201561 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:40.201382 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f" containerName="storage-initializer" Apr 22 20:22:40.201561 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:40.201394 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f" containerName="kserve-container" Apr 22 20:22:40.201561 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:40.201403 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f" containerName="kserve-container" Apr 22 20:22:40.201561 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:40.201455 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c70a59ed-df3e-4c23-90eb-fcd3b2bcfb1f" containerName="kserve-container" Apr 22 20:22:40.207536 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:40.207507 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" Apr 22 20:22:40.209530 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:40.209510 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t9c5k\"" Apr 22 20:22:40.211791 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:40.211769 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz"] Apr 22 20:22:40.324852 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:40.324826 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd71a690-c2c2-40a7-b989-0a2b652f28bf-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-h8mjz\" (UID: \"dd71a690-c2c2-40a7-b989-0a2b652f28bf\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" Apr 22 20:22:40.425795 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:40.425771 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd71a690-c2c2-40a7-b989-0a2b652f28bf-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-h8mjz\" (UID: \"dd71a690-c2c2-40a7-b989-0a2b652f28bf\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" Apr 22 20:22:40.426115 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:40.426097 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd71a690-c2c2-40a7-b989-0a2b652f28bf-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-h8mjz\" (UID: \"dd71a690-c2c2-40a7-b989-0a2b652f28bf\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" Apr 22 20:22:40.517743 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:40.517677 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" Apr 22 20:22:40.635401 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:40.635375 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz"] Apr 22 20:22:40.638031 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:22:40.638003 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd71a690_c2c2_40a7_b989_0a2b652f28bf.slice/crio-e28fee6c364b8bfc2851e9e1336afce85ae77bbc2dcee7f3255d96aa68f9610d WatchSource:0}: Error finding container e28fee6c364b8bfc2851e9e1336afce85ae77bbc2dcee7f3255d96aa68f9610d: Status 404 returned error can't find the container with id e28fee6c364b8bfc2851e9e1336afce85ae77bbc2dcee7f3255d96aa68f9610d Apr 22 20:22:41.546352 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:41.546317 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" event={"ID":"dd71a690-c2c2-40a7-b989-0a2b652f28bf","Type":"ContainerStarted","Data":"0ba2613fa534e15854637a72ac3970a41d7306b60073026451ea8b1704959472"} Apr 22 20:22:41.546697 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:41.546359 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" event={"ID":"dd71a690-c2c2-40a7-b989-0a2b652f28bf","Type":"ContainerStarted","Data":"e28fee6c364b8bfc2851e9e1336afce85ae77bbc2dcee7f3255d96aa68f9610d"} Apr 22 20:22:45.557570 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:45.557534 2579 generic.go:358] "Generic (PLEG): container finished" podID="dd71a690-c2c2-40a7-b989-0a2b652f28bf" containerID="0ba2613fa534e15854637a72ac3970a41d7306b60073026451ea8b1704959472" exitCode=0 Apr 22 20:22:45.558003 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:45.557610 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" event={"ID":"dd71a690-c2c2-40a7-b989-0a2b652f28bf","Type":"ContainerDied","Data":"0ba2613fa534e15854637a72ac3970a41d7306b60073026451ea8b1704959472"} Apr 22 20:22:52.581083 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:52.581052 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" event={"ID":"dd71a690-c2c2-40a7-b989-0a2b652f28bf","Type":"ContainerStarted","Data":"0c7c1b5e0ffa47da595df283c3ea6edd06bfa257d57aae5577f161d3b1a67151"} Apr 22 20:22:52.581512 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:52.581408 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" Apr 22 20:22:52.582770 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:52.582744 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" podUID="dd71a690-c2c2-40a7-b989-0a2b652f28bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 20:22:52.596797 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:52.596752 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" podStartSLOduration=5.773110881 podStartE2EDuration="12.596738943s" podCreationTimestamp="2026-04-22 20:22:40 +0000 UTC" firstStartedPulling="2026-04-22 20:22:45.558640611 +0000 UTC m=+1471.513003390" lastFinishedPulling="2026-04-22 
20:22:52.382268671 +0000 UTC m=+1478.336631452" observedRunningTime="2026-04-22 20:22:52.595511025 +0000 UTC m=+1478.549873825" watchObservedRunningTime="2026-04-22 20:22:52.596738943 +0000 UTC m=+1478.551101744" Apr 22 20:22:53.583548 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:22:53.583504 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" podUID="dd71a690-c2c2-40a7-b989-0a2b652f28bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 20:23:03.584420 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:23:03.584376 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" podUID="dd71a690-c2c2-40a7-b989-0a2b652f28bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 20:23:13.583463 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:23:13.583414 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" podUID="dd71a690-c2c2-40a7-b989-0a2b652f28bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 20:23:23.583653 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:23:23.583611 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" podUID="dd71a690-c2c2-40a7-b989-0a2b652f28bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 20:23:33.583999 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:23:33.583907 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" podUID="dd71a690-c2c2-40a7-b989-0a2b652f28bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 20:23:43.583456 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:23:43.583410 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" podUID="dd71a690-c2c2-40a7-b989-0a2b652f28bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 20:23:53.583700 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:23:53.583657 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" podUID="dd71a690-c2c2-40a7-b989-0a2b652f28bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 20:24:03.583977 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:03.583937 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" podUID="dd71a690-c2c2-40a7-b989-0a2b652f28bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 20:24:09.690301 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:09.690268 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" Apr 22 20:24:11.586231 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:11.586198 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz"] Apr 
22 20:24:11.586607 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:11.586450 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" podUID="dd71a690-c2c2-40a7-b989-0a2b652f28bf" containerName="kserve-container" containerID="cri-o://0c7c1b5e0ffa47da595df283c3ea6edd06bfa257d57aae5577f161d3b1a67151" gracePeriod=30 Apr 22 20:24:11.702890 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:11.702854 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m"] Apr 22 20:24:11.705826 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:11.705804 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" Apr 22 20:24:11.722380 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:11.722355 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m"] Apr 22 20:24:11.759948 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:11.759926 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39d0bea4-a132-48b9-ac1d-4c4f7cc084c3-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-8zw8m\" (UID: \"39d0bea4-a132-48b9-ac1d-4c4f7cc084c3\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" Apr 22 20:24:11.860674 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:11.860619 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39d0bea4-a132-48b9-ac1d-4c4f7cc084c3-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-8zw8m\" (UID: \"39d0bea4-a132-48b9-ac1d-4c4f7cc084c3\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" Apr 22 20:24:11.860937 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:11.860921 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39d0bea4-a132-48b9-ac1d-4c4f7cc084c3-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-8zw8m\" (UID: \"39d0bea4-a132-48b9-ac1d-4c4f7cc084c3\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" Apr 22 20:24:12.016932 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:12.016903 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" Apr 22 20:24:12.137582 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:12.137557 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m"] Apr 22 20:24:12.140557 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:24:12.140529 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39d0bea4_a132_48b9_ac1d_4c4f7cc084c3.slice/crio-16f0c920aea2aa5a674b63c49d13cb4aaca89071519a4eadfd46cd29ba726291 WatchSource:0}: Error finding container 16f0c920aea2aa5a674b63c49d13cb4aaca89071519a4eadfd46cd29ba726291: Status 404 returned error can't find the container with id 16f0c920aea2aa5a674b63c49d13cb4aaca89071519a4eadfd46cd29ba726291 Apr 22 20:24:12.784179 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:12.784123 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" event={"ID":"39d0bea4-a132-48b9-ac1d-4c4f7cc084c3","Type":"ContainerStarted","Data":"0a0b4c668226e574d8f5a3c082bcb8ca6216e0ab8594d1a70e8686e22592136b"} Apr 22 20:24:12.784179 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:12.784177 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" event={"ID":"39d0bea4-a132-48b9-ac1d-4c4f7cc084c3","Type":"ContainerStarted","Data":"16f0c920aea2aa5a674b63c49d13cb4aaca89071519a4eadfd46cd29ba726291"} Apr 22 20:24:14.729343 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:14.729320 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" Apr 22 20:24:14.776802 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:14.776750 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd71a690-c2c2-40a7-b989-0a2b652f28bf-kserve-provision-location\") pod \"dd71a690-c2c2-40a7-b989-0a2b652f28bf\" (UID: \"dd71a690-c2c2-40a7-b989-0a2b652f28bf\") " Apr 22 20:24:14.777058 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:14.777039 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd71a690-c2c2-40a7-b989-0a2b652f28bf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dd71a690-c2c2-40a7-b989-0a2b652f28bf" (UID: "dd71a690-c2c2-40a7-b989-0a2b652f28bf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:24:14.790954 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:14.790926 2579 generic.go:358] "Generic (PLEG): container finished" podID="dd71a690-c2c2-40a7-b989-0a2b652f28bf" containerID="0c7c1b5e0ffa47da595df283c3ea6edd06bfa257d57aae5577f161d3b1a67151" exitCode=0 Apr 22 20:24:14.791046 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:14.790973 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" event={"ID":"dd71a690-c2c2-40a7-b989-0a2b652f28bf","Type":"ContainerDied","Data":"0c7c1b5e0ffa47da595df283c3ea6edd06bfa257d57aae5577f161d3b1a67151"} Apr 22 20:24:14.791046 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:14.790989 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" Apr 22 20:24:14.791046 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:14.791004 2579 scope.go:117] "RemoveContainer" containerID="0c7c1b5e0ffa47da595df283c3ea6edd06bfa257d57aae5577f161d3b1a67151" Apr 22 20:24:14.791182 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:14.790995 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz" event={"ID":"dd71a690-c2c2-40a7-b989-0a2b652f28bf","Type":"ContainerDied","Data":"e28fee6c364b8bfc2851e9e1336afce85ae77bbc2dcee7f3255d96aa68f9610d"} Apr 22 20:24:14.798614 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:14.798591 2579 scope.go:117] "RemoveContainer" containerID="0ba2613fa534e15854637a72ac3970a41d7306b60073026451ea8b1704959472" Apr 22 20:24:14.805200 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:14.805186 2579 scope.go:117] "RemoveContainer" containerID="0c7c1b5e0ffa47da595df283c3ea6edd06bfa257d57aae5577f161d3b1a67151" Apr 22 20:24:14.805424 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:24:14.805407 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c7c1b5e0ffa47da595df283c3ea6edd06bfa257d57aae5577f161d3b1a67151\": container with ID starting with 0c7c1b5e0ffa47da595df283c3ea6edd06bfa257d57aae5577f161d3b1a67151 not found: ID does not exist" containerID="0c7c1b5e0ffa47da595df283c3ea6edd06bfa257d57aae5577f161d3b1a67151" Apr 22 20:24:14.805472 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:14.805431 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c7c1b5e0ffa47da595df283c3ea6edd06bfa257d57aae5577f161d3b1a67151"} err="failed to get container status \"0c7c1b5e0ffa47da595df283c3ea6edd06bfa257d57aae5577f161d3b1a67151\": rpc error: code = NotFound desc = could not find container \"0c7c1b5e0ffa47da595df283c3ea6edd06bfa257d57aae5577f161d3b1a67151\": container with ID starting with 0c7c1b5e0ffa47da595df283c3ea6edd06bfa257d57aae5577f161d3b1a67151 not found: ID does not exist" Apr 22 20:24:14.805472 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:14.805447 2579 scope.go:117] "RemoveContainer" containerID="0ba2613fa534e15854637a72ac3970a41d7306b60073026451ea8b1704959472" Apr 22 20:24:14.805659 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:24:14.805646 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ba2613fa534e15854637a72ac3970a41d7306b60073026451ea8b1704959472\": container with ID starting with 0ba2613fa534e15854637a72ac3970a41d7306b60073026451ea8b1704959472 not found: ID does not exist" containerID="0ba2613fa534e15854637a72ac3970a41d7306b60073026451ea8b1704959472" Apr 22 20:24:14.805696 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:14.805663 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ba2613fa534e15854637a72ac3970a41d7306b60073026451ea8b1704959472"} err="failed to get container status \"0ba2613fa534e15854637a72ac3970a41d7306b60073026451ea8b1704959472\": rpc error: code = NotFound desc = could not find container \"0ba2613fa534e15854637a72ac3970a41d7306b60073026451ea8b1704959472\": container with ID starting with 0ba2613fa534e15854637a72ac3970a41d7306b60073026451ea8b1704959472 not found: ID does not exist" Apr 22 20:24:14.810519 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:14.810476 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz"] Apr 22 20:24:14.812990 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:14.812969 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-h8mjz"] Apr 22 20:24:14.877265 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:14.877246 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd71a690-c2c2-40a7-b989-0a2b652f28bf-kserve-provision-location\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:24:16.692177 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:16.692122 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd71a690-c2c2-40a7-b989-0a2b652f28bf" path="/var/lib/kubelet/pods/dd71a690-c2c2-40a7-b989-0a2b652f28bf/volumes" Apr 22 20:24:16.797786 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:16.797759 2579 generic.go:358] "Generic (PLEG): container finished" podID="39d0bea4-a132-48b9-ac1d-4c4f7cc084c3" containerID="0a0b4c668226e574d8f5a3c082bcb8ca6216e0ab8594d1a70e8686e22592136b" exitCode=0 Apr 22 20:24:16.797905 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:16.797837 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" event={"ID":"39d0bea4-a132-48b9-ac1d-4c4f7cc084c3","Type":"ContainerDied","Data":"0a0b4c668226e574d8f5a3c082bcb8ca6216e0ab8594d1a70e8686e22592136b"} Apr 22 20:24:17.802884 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:17.802850 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" event={"ID":"39d0bea4-a132-48b9-ac1d-4c4f7cc084c3","Type":"ContainerStarted","Data":"f20c6d77bf5d9c2f3fc3f90ab72d6b7705f4b4c9d45823aa85681a6a85c0c7aa"} Apr 22 20:24:17.803316 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:17.803125 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" Apr 22 20:24:17.804318 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:17.804292 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" podUID="39d0bea4-a132-48b9-ac1d-4c4f7cc084c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 22 20:24:17.816425 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:17.816389 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" podStartSLOduration=6.816377292 podStartE2EDuration="6.816377292s" podCreationTimestamp="2026-04-22 20:24:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:24:17.816056434 +0000 UTC m=+1563.770419234" watchObservedRunningTime="2026-04-22 20:24:17.816377292 +0000 UTC m=+1563.770740092" Apr 22 20:24:18.805868 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:18.805826 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" podUID="39d0bea4-a132-48b9-ac1d-4c4f7cc084c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 22 20:24:28.806644 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:28.806603 2579 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" podUID="39d0bea4-a132-48b9-ac1d-4c4f7cc084c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 22 20:24:38.806749 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:38.806709 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" podUID="39d0bea4-a132-48b9-ac1d-4c4f7cc084c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 22 20:24:48.806462 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:48.806418 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" podUID="39d0bea4-a132-48b9-ac1d-4c4f7cc084c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 22 20:24:58.806760 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:24:58.806720 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" podUID="39d0bea4-a132-48b9-ac1d-4c4f7cc084c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 22 20:25:08.806691 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:08.806593 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" podUID="39d0bea4-a132-48b9-ac1d-4c4f7cc084c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 22 20:25:18.806444 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:18.806404 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" podUID="39d0bea4-a132-48b9-ac1d-4c4f7cc084c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 22 20:25:28.806606 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:28.806564 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" podUID="39d0bea4-a132-48b9-ac1d-4c4f7cc084c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 22 20:25:36.695306 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:36.695276 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" Apr 22 20:25:42.769066 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:42.769035 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m"] Apr 22 20:25:42.771482 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:42.769278 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" podUID="39d0bea4-a132-48b9-ac1d-4c4f7cc084c3" containerName="kserve-container" containerID="cri-o://f20c6d77bf5d9c2f3fc3f90ab72d6b7705f4b4c9d45823aa85681a6a85c0c7aa" gracePeriod=30 Apr 22 20:25:42.861458 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:42.861431 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk"] Apr 22 
20:25:42.861680 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:42.861669 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd71a690-c2c2-40a7-b989-0a2b652f28bf" containerName="storage-initializer" Apr 22 20:25:42.861731 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:42.861682 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd71a690-c2c2-40a7-b989-0a2b652f28bf" containerName="storage-initializer" Apr 22 20:25:42.861731 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:42.861690 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd71a690-c2c2-40a7-b989-0a2b652f28bf" containerName="kserve-container" Apr 22 20:25:42.861731 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:42.861695 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd71a690-c2c2-40a7-b989-0a2b652f28bf" containerName="kserve-container" Apr 22 20:25:42.861827 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:42.861752 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd71a690-c2c2-40a7-b989-0a2b652f28bf" containerName="kserve-container" Apr 22 20:25:42.864472 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:42.864457 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" Apr 22 20:25:42.871680 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:42.871652 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk"] Apr 22 20:25:42.960899 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:42.960864 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dad534b1-90d3-4a99-a38b-729726689396-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk\" (UID: \"dad534b1-90d3-4a99-a38b-729726689396\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" Apr 22 20:25:43.062102 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:43.062037 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dad534b1-90d3-4a99-a38b-729726689396-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk\" (UID: \"dad534b1-90d3-4a99-a38b-729726689396\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" Apr 22 20:25:43.062367 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:43.062352 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dad534b1-90d3-4a99-a38b-729726689396-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk\" (UID: \"dad534b1-90d3-4a99-a38b-729726689396\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" Apr 22 20:25:43.175893 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:43.175868 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" Apr 22 20:25:43.289903 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:43.289874 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk"] Apr 22 20:25:43.292389 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:25:43.292354 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddad534b1_90d3_4a99_a38b_729726689396.slice/crio-6c987b2cf2db9ed1b3399b0fb86ca2e562fdef5b892e7c67700abe4b655dff8e WatchSource:0}: Error finding container 6c987b2cf2db9ed1b3399b0fb86ca2e562fdef5b892e7c67700abe4b655dff8e: Status 404 returned error can't find the container with id 6c987b2cf2db9ed1b3399b0fb86ca2e562fdef5b892e7c67700abe4b655dff8e Apr 22 20:25:43.294062 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:43.294041 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:25:44.016913 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:44.016882 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" event={"ID":"dad534b1-90d3-4a99-a38b-729726689396","Type":"ContainerStarted","Data":"8f60a19cde98a0aace0b146dd796a7a18fa08110b6730b8d58521e02721dc3b3"} Apr 22 20:25:44.017306 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:44.016920 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" event={"ID":"dad534b1-90d3-4a99-a38b-729726689396","Type":"ContainerStarted","Data":"6c987b2cf2db9ed1b3399b0fb86ca2e562fdef5b892e7c67700abe4b655dff8e"} Apr 22 20:25:45.914494 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:45.914475 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" Apr 22 20:25:45.983123 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:45.983060 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39d0bea4-a132-48b9-ac1d-4c4f7cc084c3-kserve-provision-location\") pod \"39d0bea4-a132-48b9-ac1d-4c4f7cc084c3\" (UID: \"39d0bea4-a132-48b9-ac1d-4c4f7cc084c3\") " Apr 22 20:25:45.983375 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:45.983353 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39d0bea4-a132-48b9-ac1d-4c4f7cc084c3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "39d0bea4-a132-48b9-ac1d-4c4f7cc084c3" (UID: "39d0bea4-a132-48b9-ac1d-4c4f7cc084c3"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:25:46.022360 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:46.022333 2579 generic.go:358] "Generic (PLEG): container finished" podID="39d0bea4-a132-48b9-ac1d-4c4f7cc084c3" containerID="f20c6d77bf5d9c2f3fc3f90ab72d6b7705f4b4c9d45823aa85681a6a85c0c7aa" exitCode=0 Apr 22 20:25:46.022451 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:46.022391 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" event={"ID":"39d0bea4-a132-48b9-ac1d-4c4f7cc084c3","Type":"ContainerDied","Data":"f20c6d77bf5d9c2f3fc3f90ab72d6b7705f4b4c9d45823aa85681a6a85c0c7aa"} Apr 22 20:25:46.022451 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:46.022422 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" event={"ID":"39d0bea4-a132-48b9-ac1d-4c4f7cc084c3","Type":"ContainerDied","Data":"16f0c920aea2aa5a674b63c49d13cb4aaca89071519a4eadfd46cd29ba726291"} Apr 22 20:25:46.022451 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:46.022438 2579 scope.go:117] "RemoveContainer" containerID="f20c6d77bf5d9c2f3fc3f90ab72d6b7705f4b4c9d45823aa85681a6a85c0c7aa" Apr 22 20:25:46.022576 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:46.022452 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m" Apr 22 20:25:46.030430 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:46.030405 2579 scope.go:117] "RemoveContainer" containerID="0a0b4c668226e574d8f5a3c082bcb8ca6216e0ab8594d1a70e8686e22592136b" Apr 22 20:25:46.037099 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:46.037083 2579 scope.go:117] "RemoveContainer" containerID="f20c6d77bf5d9c2f3fc3f90ab72d6b7705f4b4c9d45823aa85681a6a85c0c7aa" Apr 22 20:25:46.037405 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:25:46.037385 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f20c6d77bf5d9c2f3fc3f90ab72d6b7705f4b4c9d45823aa85681a6a85c0c7aa\": container with ID starting with f20c6d77bf5d9c2f3fc3f90ab72d6b7705f4b4c9d45823aa85681a6a85c0c7aa not found: ID does not exist" containerID="f20c6d77bf5d9c2f3fc3f90ab72d6b7705f4b4c9d45823aa85681a6a85c0c7aa" Apr 22 20:25:46.037465 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:46.037413 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f20c6d77bf5d9c2f3fc3f90ab72d6b7705f4b4c9d45823aa85681a6a85c0c7aa"} err="failed to get container status \"f20c6d77bf5d9c2f3fc3f90ab72d6b7705f4b4c9d45823aa85681a6a85c0c7aa\": rpc error: code = NotFound desc = could not find container \"f20c6d77bf5d9c2f3fc3f90ab72d6b7705f4b4c9d45823aa85681a6a85c0c7aa\": container with ID starting with f20c6d77bf5d9c2f3fc3f90ab72d6b7705f4b4c9d45823aa85681a6a85c0c7aa not found: ID does not exist" Apr 22 20:25:46.037465 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:46.037430 2579 scope.go:117] "RemoveContainer" containerID="0a0b4c668226e574d8f5a3c082bcb8ca6216e0ab8594d1a70e8686e22592136b" Apr 22 20:25:46.037640 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:25:46.037627 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a0b4c668226e574d8f5a3c082bcb8ca6216e0ab8594d1a70e8686e22592136b\": container with ID starting with 0a0b4c668226e574d8f5a3c082bcb8ca6216e0ab8594d1a70e8686e22592136b not found: ID does not 
exist" containerID="0a0b4c668226e574d8f5a3c082bcb8ca6216e0ab8594d1a70e8686e22592136b" Apr 22 20:25:46.037679 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:46.037642 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a0b4c668226e574d8f5a3c082bcb8ca6216e0ab8594d1a70e8686e22592136b"} err="failed to get container status \"0a0b4c668226e574d8f5a3c082bcb8ca6216e0ab8594d1a70e8686e22592136b\": rpc error: code = NotFound desc = could not find container \"0a0b4c668226e574d8f5a3c082bcb8ca6216e0ab8594d1a70e8686e22592136b\": container with ID starting with 0a0b4c668226e574d8f5a3c082bcb8ca6216e0ab8594d1a70e8686e22592136b not found: ID does not exist" Apr 22 20:25:46.041737 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:46.041707 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m"] Apr 22 20:25:46.045474 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:46.045456 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-8zw8m"] Apr 22 20:25:46.084005 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:46.083984 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39d0bea4-a132-48b9-ac1d-4c4f7cc084c3-kserve-provision-location\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:25:46.692663 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:46.692634 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39d0bea4-a132-48b9-ac1d-4c4f7cc084c3" path="/var/lib/kubelet/pods/39d0bea4-a132-48b9-ac1d-4c4f7cc084c3/volumes" Apr 22 20:25:47.025736 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:47.025660 2579 generic.go:358] "Generic (PLEG): container finished" podID="dad534b1-90d3-4a99-a38b-729726689396" containerID="8f60a19cde98a0aace0b146dd796a7a18fa08110b6730b8d58521e02721dc3b3" exitCode=0 Apr 22 20:25:47.026073 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:47.025740 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" event={"ID":"dad534b1-90d3-4a99-a38b-729726689396","Type":"ContainerDied","Data":"8f60a19cde98a0aace0b146dd796a7a18fa08110b6730b8d58521e02721dc3b3"} Apr 22 20:25:48.031213 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:48.031181 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" event={"ID":"dad534b1-90d3-4a99-a38b-729726689396","Type":"ContainerStarted","Data":"1e3a73112cb982000ea2ea085d0e65d8a8b2c0243fcf7a47503d9a1e026fd553"} Apr 22 20:25:48.031576 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:48.031459 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" Apr 22 20:25:48.032626 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:48.032603 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" podUID="dad534b1-90d3-4a99-a38b-729726689396" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 22 20:25:48.047419 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:48.047381 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" 
podStartSLOduration=6.047368414 podStartE2EDuration="6.047368414s" podCreationTimestamp="2026-04-22 20:25:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:25:48.045580194 +0000 UTC m=+1653.999942996" watchObservedRunningTime="2026-04-22 20:25:48.047368414 +0000 UTC m=+1654.001731265" Apr 22 20:25:49.033704 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:49.033664 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" podUID="dad534b1-90d3-4a99-a38b-729726689396" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 22 20:25:59.034283 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:25:59.034238 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" podUID="dad534b1-90d3-4a99-a38b-729726689396" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 22 20:26:09.034301 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:26:09.034257 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" podUID="dad534b1-90d3-4a99-a38b-729726689396" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 22 20:26:19.033814 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:26:19.033769 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" podUID="dad534b1-90d3-4a99-a38b-729726689396" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 22 20:26:29.034318 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:26:29.034282 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" podUID="dad534b1-90d3-4a99-a38b-729726689396" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 22 20:26:39.034202 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:26:39.034100 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" podUID="dad534b1-90d3-4a99-a38b-729726689396" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 22 20:26:49.033800 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:26:49.033753 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" podUID="dad534b1-90d3-4a99-a38b-729726689396" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 22 20:26:49.689329 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:26:49.689291 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" podUID="dad534b1-90d3-4a99-a38b-729726689396" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 22 20:26:59.689387 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:26:59.689346 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" podUID="dad534b1-90d3-4a99-a38b-729726689396" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 22 20:27:09.690475 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:27:09.690440 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" Apr 22 20:27:13.993159 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:27:13.993106 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk"] Apr 22 20:27:13.993526 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:27:13.993382 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" podUID="dad534b1-90d3-4a99-a38b-729726689396" containerName="kserve-container" containerID="cri-o://1e3a73112cb982000ea2ea085d0e65d8a8b2c0243fcf7a47503d9a1e026fd553" gracePeriod=30 Apr 22 20:27:17.126408 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:27:17.126387 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" Apr 22 20:27:17.183560 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:27:17.183485 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dad534b1-90d3-4a99-a38b-729726689396-kserve-provision-location\") pod \"dad534b1-90d3-4a99-a38b-729726689396\" (UID: \"dad534b1-90d3-4a99-a38b-729726689396\") " Apr 22 20:27:17.183810 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:27:17.183789 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dad534b1-90d3-4a99-a38b-729726689396-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dad534b1-90d3-4a99-a38b-729726689396" (UID: "dad534b1-90d3-4a99-a38b-729726689396"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:27:17.258521 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:27:17.258490 2579 generic.go:358] "Generic (PLEG): container finished" podID="dad534b1-90d3-4a99-a38b-729726689396" containerID="1e3a73112cb982000ea2ea085d0e65d8a8b2c0243fcf7a47503d9a1e026fd553" exitCode=0 Apr 22 20:27:17.258636 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:27:17.258550 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" event={"ID":"dad534b1-90d3-4a99-a38b-729726689396","Type":"ContainerDied","Data":"1e3a73112cb982000ea2ea085d0e65d8a8b2c0243fcf7a47503d9a1e026fd553"} Apr 22 20:27:17.258636 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:27:17.258585 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" event={"ID":"dad534b1-90d3-4a99-a38b-729726689396","Type":"ContainerDied","Data":"6c987b2cf2db9ed1b3399b0fb86ca2e562fdef5b892e7c67700abe4b655dff8e"} Apr 22 20:27:17.258636 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:27:17.258606 2579 scope.go:117] "RemoveContainer" containerID="1e3a73112cb982000ea2ea085d0e65d8a8b2c0243fcf7a47503d9a1e026fd553" Apr 22 20:27:17.258636 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:27:17.258630 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk" Apr 22 20:27:17.266412 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:27:17.266393 2579 scope.go:117] "RemoveContainer" containerID="8f60a19cde98a0aace0b146dd796a7a18fa08110b6730b8d58521e02721dc3b3" Apr 22 20:27:17.273022 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:27:17.273007 2579 scope.go:117] "RemoveContainer" containerID="1e3a73112cb982000ea2ea085d0e65d8a8b2c0243fcf7a47503d9a1e026fd553" Apr 22 20:27:17.273302 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:27:17.273283 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e3a73112cb982000ea2ea085d0e65d8a8b2c0243fcf7a47503d9a1e026fd553\": container with ID starting with 1e3a73112cb982000ea2ea085d0e65d8a8b2c0243fcf7a47503d9a1e026fd553 not found: ID does not exist" containerID="1e3a73112cb982000ea2ea085d0e65d8a8b2c0243fcf7a47503d9a1e026fd553" Apr 22 20:27:17.273361 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:27:17.273310 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e3a73112cb982000ea2ea085d0e65d8a8b2c0243fcf7a47503d9a1e026fd553"} err="failed to get container status \"1e3a73112cb982000ea2ea085d0e65d8a8b2c0243fcf7a47503d9a1e026fd553\": rpc error: code = NotFound desc = could not find container \"1e3a73112cb982000ea2ea085d0e65d8a8b2c0243fcf7a47503d9a1e026fd553\": container with ID starting with 1e3a73112cb982000ea2ea085d0e65d8a8b2c0243fcf7a47503d9a1e026fd553 not found: ID does not exist" Apr 22 20:27:17.273361 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:27:17.273328 2579 scope.go:117] "RemoveContainer" containerID="8f60a19cde98a0aace0b146dd796a7a18fa08110b6730b8d58521e02721dc3b3" Apr 22 20:27:17.273544 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:27:17.273526 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f60a19cde98a0aace0b146dd796a7a18fa08110b6730b8d58521e02721dc3b3\": container with ID starting with 8f60a19cde98a0aace0b146dd796a7a18fa08110b6730b8d58521e02721dc3b3 not found: ID does not exist" containerID="8f60a19cde98a0aace0b146dd796a7a18fa08110b6730b8d58521e02721dc3b3" Apr 22 20:27:17.273582 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:27:17.273550 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f60a19cde98a0aace0b146dd796a7a18fa08110b6730b8d58521e02721dc3b3"} err="failed to get container status \"8f60a19cde98a0aace0b146dd796a7a18fa08110b6730b8d58521e02721dc3b3\": rpc error: code = NotFound desc = could not find container \"8f60a19cde98a0aace0b146dd796a7a18fa08110b6730b8d58521e02721dc3b3\": container with ID starting with 8f60a19cde98a0aace0b146dd796a7a18fa08110b6730b8d58521e02721dc3b3 not found: ID does not exist" Apr 22 20:27:17.277443 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:27:17.277423 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk"] Apr 22 20:27:17.280864 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:27:17.280845 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-7xzrk"] Apr 22 20:27:17.284814 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:27:17.284795 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/dad534b1-90d3-4a99-a38b-729726689396-kserve-provision-location\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:27:18.692078 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:27:18.692042 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dad534b1-90d3-4a99-a38b-729726689396" path="/var/lib/kubelet/pods/dad534b1-90d3-4a99-a38b-729726689396/volumes" Apr 22 20:28:58.510829 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:58.510792 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c"] Apr 22 20:28:58.511282 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:58.511030 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39d0bea4-a132-48b9-ac1d-4c4f7cc084c3" containerName="kserve-container" Apr 22 20:28:58.511282 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:58.511042 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="39d0bea4-a132-48b9-ac1d-4c4f7cc084c3" containerName="kserve-container" Apr 22 20:28:58.511282 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:58.511050 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dad534b1-90d3-4a99-a38b-729726689396" containerName="kserve-container" Apr 22 20:28:58.511282 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:58.511055 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="dad534b1-90d3-4a99-a38b-729726689396" containerName="kserve-container" Apr 22 20:28:58.511282 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:58.511063 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dad534b1-90d3-4a99-a38b-729726689396" containerName="storage-initializer" Apr 22 20:28:58.511282 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:58.511070 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="dad534b1-90d3-4a99-a38b-729726689396" containerName="storage-initializer" Apr 22 20:28:58.511282 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:58.511084 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39d0bea4-a132-48b9-ac1d-4c4f7cc084c3" containerName="storage-initializer" Apr 22 20:28:58.511282 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:58.511089 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="39d0bea4-a132-48b9-ac1d-4c4f7cc084c3" containerName="storage-initializer" Apr 22 20:28:58.511282 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:58.511127 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="39d0bea4-a132-48b9-ac1d-4c4f7cc084c3" containerName="kserve-container" Apr 22 20:28:58.511282 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:58.511134 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="dad534b1-90d3-4a99-a38b-729726689396" containerName="kserve-container" Apr 22 20:28:58.513974 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:58.513951 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" Apr 22 20:28:58.516004 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:58.515986 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t9c5k\"" Apr 22 20:28:58.523688 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:58.523666 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c"] Apr 22 20:28:58.650994 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:58.650961 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d3d999a6-b894-46af-9f67-f1b0ff7be28b-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-h4v7c\" (UID: \"d3d999a6-b894-46af-9f67-f1b0ff7be28b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" Apr 22 20:28:58.751278 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:58.751244 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d3d999a6-b894-46af-9f67-f1b0ff7be28b-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-h4v7c\" (UID: \"d3d999a6-b894-46af-9f67-f1b0ff7be28b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" Apr 22 20:28:58.751589 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:58.751571 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d3d999a6-b894-46af-9f67-f1b0ff7be28b-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-h4v7c\" (UID: \"d3d999a6-b894-46af-9f67-f1b0ff7be28b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" Apr 22 20:28:58.824056 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:58.823999 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" Apr 22 20:28:58.944460 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:58.944426 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c"] Apr 22 20:28:58.947632 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:28:58.947603 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3d999a6_b894_46af_9f67_f1b0ff7be28b.slice/crio-46de9b89534ccb033e0db3cad405e38dbd2d4465958e44f3e21a94b5ac881129 WatchSource:0}: Error finding container 46de9b89534ccb033e0db3cad405e38dbd2d4465958e44f3e21a94b5ac881129: Status 404 returned error can't find the container with id 46de9b89534ccb033e0db3cad405e38dbd2d4465958e44f3e21a94b5ac881129 Apr 22 20:28:59.519239 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:59.519195 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" event={"ID":"d3d999a6-b894-46af-9f67-f1b0ff7be28b","Type":"ContainerStarted","Data":"1191c5eeb6db83b22c4dd89c9d161316450ad179b8ff3a298710604488b5fad6"} Apr 22 20:28:59.519239 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:28:59.519241 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" event={"ID":"d3d999a6-b894-46af-9f67-f1b0ff7be28b","Type":"ContainerStarted","Data":"46de9b89534ccb033e0db3cad405e38dbd2d4465958e44f3e21a94b5ac881129"} Apr 22 20:29:03.533281 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:29:03.533247 2579 generic.go:358] "Generic (PLEG): container finished" podID="d3d999a6-b894-46af-9f67-f1b0ff7be28b" containerID="1191c5eeb6db83b22c4dd89c9d161316450ad179b8ff3a298710604488b5fad6" exitCode=0 Apr 22 20:29:03.533622 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:29:03.533289 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" event={"ID":"d3d999a6-b894-46af-9f67-f1b0ff7be28b","Type":"ContainerDied","Data":"1191c5eeb6db83b22c4dd89c9d161316450ad179b8ff3a298710604488b5fad6"} Apr 22 20:29:28.611162 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:29:28.611081 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" event={"ID":"d3d999a6-b894-46af-9f67-f1b0ff7be28b","Type":"ContainerStarted","Data":"db1178c5e080be71580e511a3508ac5a46f9942a33fff01953541b0ef7881507"} Apr 22 20:29:28.611497 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:29:28.611387 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" Apr 22 20:29:28.612357 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:29:28.612336 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" podUID="d3d999a6-b894-46af-9f67-f1b0ff7be28b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 20:29:28.626548 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:29:28.626503 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" podStartSLOduration=5.885177296 podStartE2EDuration="30.626491324s" podCreationTimestamp="2026-04-22 20:28:58 
+0000 UTC" firstStartedPulling="2026-04-22 20:29:03.534394375 +0000 UTC m=+1849.488757154" lastFinishedPulling="2026-04-22 20:29:28.275708398 +0000 UTC m=+1874.230071182" observedRunningTime="2026-04-22 20:29:28.625034628 +0000 UTC m=+1874.579397429" watchObservedRunningTime="2026-04-22 20:29:28.626491324 +0000 UTC m=+1874.580854125" Apr 22 20:29:29.613985 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:29:29.613944 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" podUID="d3d999a6-b894-46af-9f67-f1b0ff7be28b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 20:29:39.614587 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:29:39.614504 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" podUID="d3d999a6-b894-46af-9f67-f1b0ff7be28b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 20:29:49.614385 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:29:49.614339 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" podUID="d3d999a6-b894-46af-9f67-f1b0ff7be28b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 20:29:59.614275 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:29:59.614233 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" podUID="d3d999a6-b894-46af-9f67-f1b0ff7be28b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 20:30:09.614123 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:09.614082 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" podUID="d3d999a6-b894-46af-9f67-f1b0ff7be28b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 20:30:19.614927 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:19.614878 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" podUID="d3d999a6-b894-46af-9f67-f1b0ff7be28b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 20:30:29.614528 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:29.614489 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" podUID="d3d999a6-b894-46af-9f67-f1b0ff7be28b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 20:30:39.614899 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:39.614868 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" Apr 22 20:30:48.713948 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:48.713915 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c"] Apr 22 20:30:48.716465 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:48.714181 2579 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" podUID="d3d999a6-b894-46af-9f67-f1b0ff7be28b" containerName="kserve-container" containerID="cri-o://db1178c5e080be71580e511a3508ac5a46f9942a33fff01953541b0ef7881507" gracePeriod=30 Apr 22 20:30:48.788318 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:48.788290 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d"] Apr 22 20:30:48.791311 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:48.791297 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" Apr 22 20:30:48.798845 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:48.798823 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76d75314-94e5-4826-9437-19ed59b8ac61-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d\" (UID: \"76d75314-94e5-4826-9437-19ed59b8ac61\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" Apr 22 20:30:48.801421 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:48.801395 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d"] Apr 22 20:30:48.900009 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:48.899973 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76d75314-94e5-4826-9437-19ed59b8ac61-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d\" (UID: \"76d75314-94e5-4826-9437-19ed59b8ac61\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" Apr 22 20:30:48.900428 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:48.900408 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76d75314-94e5-4826-9437-19ed59b8ac61-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d\" (UID: \"76d75314-94e5-4826-9437-19ed59b8ac61\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" Apr 22 20:30:49.101059 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:49.100973 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" Apr 22 20:30:49.217899 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:49.217870 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d"] Apr 22 20:30:49.221083 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:30:49.221056 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76d75314_94e5_4826_9437_19ed59b8ac61.slice/crio-e913f9c515a7fe1c89e20675b8c6d5358c67d95689de260887ddf500b76f4be8 WatchSource:0}: Error finding container e913f9c515a7fe1c89e20675b8c6d5358c67d95689de260887ddf500b76f4be8: Status 404 returned error can't find the container with id e913f9c515a7fe1c89e20675b8c6d5358c67d95689de260887ddf500b76f4be8 Apr 22 20:30:49.222773 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:49.222755 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:30:49.614519 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:49.614474 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" podUID="d3d999a6-b894-46af-9f67-f1b0ff7be28b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 20:30:49.818076 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:49.818041 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" event={"ID":"76d75314-94e5-4826-9437-19ed59b8ac61","Type":"ContainerStarted","Data":"719a9b243b66e5a29fc040f7c6b115e77901d372612fac0edd3ecdee7206f628"} Apr 22 20:30:49.818449 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:49.818085 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" event={"ID":"76d75314-94e5-4826-9437-19ed59b8ac61","Type":"ContainerStarted","Data":"e913f9c515a7fe1c89e20675b8c6d5358c67d95689de260887ddf500b76f4be8"} Apr 22 20:30:52.827480 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:52.827448 2579 generic.go:358] "Generic (PLEG): container finished" podID="d3d999a6-b894-46af-9f67-f1b0ff7be28b" containerID="db1178c5e080be71580e511a3508ac5a46f9942a33fff01953541b0ef7881507" exitCode=0 Apr 22 20:30:52.827782 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:52.827492 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" event={"ID":"d3d999a6-b894-46af-9f67-f1b0ff7be28b","Type":"ContainerDied","Data":"db1178c5e080be71580e511a3508ac5a46f9942a33fff01953541b0ef7881507"} Apr 22 20:30:52.852004 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:52.851986 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" Apr 22 20:30:52.925549 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:52.925488 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d3d999a6-b894-46af-9f67-f1b0ff7be28b-kserve-provision-location\") pod \"d3d999a6-b894-46af-9f67-f1b0ff7be28b\" (UID: \"d3d999a6-b894-46af-9f67-f1b0ff7be28b\") " Apr 22 20:30:52.925793 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:52.925771 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3d999a6-b894-46af-9f67-f1b0ff7be28b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d3d999a6-b894-46af-9f67-f1b0ff7be28b" (UID: "d3d999a6-b894-46af-9f67-f1b0ff7be28b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:30:53.026326 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:53.026302 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d3d999a6-b894-46af-9f67-f1b0ff7be28b-kserve-provision-location\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:30:53.832071 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:53.832040 2579 generic.go:358] "Generic (PLEG): container finished" podID="76d75314-94e5-4826-9437-19ed59b8ac61" containerID="719a9b243b66e5a29fc040f7c6b115e77901d372612fac0edd3ecdee7206f628" exitCode=0 Apr 22 20:30:53.832490 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:53.832121 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" event={"ID":"76d75314-94e5-4826-9437-19ed59b8ac61","Type":"ContainerDied","Data":"719a9b243b66e5a29fc040f7c6b115e77901d372612fac0edd3ecdee7206f628"} Apr 22 20:30:53.833569 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:53.833548 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" event={"ID":"d3d999a6-b894-46af-9f67-f1b0ff7be28b","Type":"ContainerDied","Data":"46de9b89534ccb033e0db3cad405e38dbd2d4465958e44f3e21a94b5ac881129"} Apr 22 20:30:53.833646 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:53.833589 2579 scope.go:117] "RemoveContainer" containerID="db1178c5e080be71580e511a3508ac5a46f9942a33fff01953541b0ef7881507" Apr 22 20:30:53.833646 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:53.833597 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c" Apr 22 20:30:53.841737 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:53.841716 2579 scope.go:117] "RemoveContainer" containerID="1191c5eeb6db83b22c4dd89c9d161316450ad179b8ff3a298710604488b5fad6" Apr 22 20:30:53.859632 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:53.859611 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c"] Apr 22 20:30:53.861513 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:53.861491 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-h4v7c"] Apr 22 20:30:54.693302 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:54.693271 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3d999a6-b894-46af-9f67-f1b0ff7be28b" path="/var/lib/kubelet/pods/d3d999a6-b894-46af-9f67-f1b0ff7be28b/volumes" Apr 22 20:30:54.837444 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:54.837413 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" event={"ID":"76d75314-94e5-4826-9437-19ed59b8ac61","Type":"ContainerStarted","Data":"aed7f0a41f0eb94b669f24c1e916cb7dc7a63b72d1d16a85a6100cfb889e0ef0"} Apr 22 20:30:54.837840 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:54.837705 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" Apr 22 20:30:54.840747 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:54.840611 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" podUID="76d75314-94e5-4826-9437-19ed59b8ac61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 22 20:30:54.856596 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:54.856559 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" podStartSLOduration=6.856546442 podStartE2EDuration="6.856546442s" podCreationTimestamp="2026-04-22 20:30:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:30:54.854923326 +0000 UTC m=+1960.809286138" watchObservedRunningTime="2026-04-22 20:30:54.856546442 +0000 UTC m=+1960.810909297" Apr 22 20:30:55.843069 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:30:55.843035 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" podUID="76d75314-94e5-4826-9437-19ed59b8ac61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 22 20:31:05.843514 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:31:05.843428 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" podUID="76d75314-94e5-4826-9437-19ed59b8ac61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 22 20:31:15.843778 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:31:15.843739 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" podUID="76d75314-94e5-4826-9437-19ed59b8ac61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 22 20:31:25.843657 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:31:25.843618 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" podUID="76d75314-94e5-4826-9437-19ed59b8ac61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 22 20:31:35.843175 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:31:35.843116 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" podUID="76d75314-94e5-4826-9437-19ed59b8ac61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 22 20:31:45.843417 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:31:45.843379 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" podUID="76d75314-94e5-4826-9437-19ed59b8ac61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 22 20:31:55.843337 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:31:55.843296 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" podUID="76d75314-94e5-4826-9437-19ed59b8ac61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 22 20:32:05.843960 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:05.843931 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" Apr 22 20:32:08.892300 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:08.892266 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d"] Apr 22 20:32:08.892696 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:08.892633 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" podUID="76d75314-94e5-4826-9437-19ed59b8ac61" containerName="kserve-container" containerID="cri-o://aed7f0a41f0eb94b669f24c1e916cb7dc7a63b72d1d16a85a6100cfb889e0ef0" gracePeriod=30 Apr 22 20:32:08.942564 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:08.942540 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d"] Apr 22 20:32:08.942816 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:08.942805 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3d999a6-b894-46af-9f67-f1b0ff7be28b" containerName="kserve-container" Apr 22 20:32:08.942856 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:08.942817 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d999a6-b894-46af-9f67-f1b0ff7be28b" containerName="kserve-container" Apr 22 20:32:08.942856 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:08.942826 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3d999a6-b894-46af-9f67-f1b0ff7be28b" containerName="storage-initializer" Apr 22 20:32:08.942856 ip-10-0-141-46 
kubenswrapper[2579]: I0422 20:32:08.942831 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d999a6-b894-46af-9f67-f1b0ff7be28b" containerName="storage-initializer" Apr 22 20:32:08.942945 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:08.942870 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d3d999a6-b894-46af-9f67-f1b0ff7be28b" containerName="kserve-container" Apr 22 20:32:08.945754 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:08.945737 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" Apr 22 20:32:08.953767 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:08.953744 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d"] Apr 22 20:32:09.032409 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:09.032377 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4e0964f-cd80-4657-a859-18c446344122-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-6pk5d\" (UID: \"f4e0964f-cd80-4657-a859-18c446344122\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" Apr 22 20:32:09.132725 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:09.132698 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4e0964f-cd80-4657-a859-18c446344122-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-6pk5d\" (UID: \"f4e0964f-cd80-4657-a859-18c446344122\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" Apr 22 20:32:09.133087 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:09.133062 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4e0964f-cd80-4657-a859-18c446344122-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-6pk5d\" (UID: \"f4e0964f-cd80-4657-a859-18c446344122\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" Apr 22 20:32:09.256508 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:09.256435 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" Apr 22 20:32:09.376506 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:09.376479 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d"] Apr 22 20:32:09.379286 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:32:09.379257 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4e0964f_cd80_4657_a859_18c446344122.slice/crio-1d6926311eb8242eaa3e6809f35d106e983f5234263781fa17e0ece145948f32 WatchSource:0}: Error finding container 1d6926311eb8242eaa3e6809f35d106e983f5234263781fa17e0ece145948f32: Status 404 returned error can't find the container with id 1d6926311eb8242eaa3e6809f35d106e983f5234263781fa17e0ece145948f32 Apr 22 20:32:10.028580 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:10.028538 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" event={"ID":"f4e0964f-cd80-4657-a859-18c446344122","Type":"ContainerStarted","Data":"d515e8a648b3738b16cf3c34578dde92466e1a7c10efa57c7eff8855ded03b93"} Apr 22 20:32:10.028580 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:10.028582 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" event={"ID":"f4e0964f-cd80-4657-a859-18c446344122","Type":"ContainerStarted","Data":"1d6926311eb8242eaa3e6809f35d106e983f5234263781fa17e0ece145948f32"} Apr 22 20:32:13.035675 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:13.035648 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" Apr 22 20:32:13.037743 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:13.037705 2579 generic.go:358] "Generic (PLEG): container finished" podID="76d75314-94e5-4826-9437-19ed59b8ac61" containerID="aed7f0a41f0eb94b669f24c1e916cb7dc7a63b72d1d16a85a6100cfb889e0ef0" exitCode=0 Apr 22 20:32:13.037850 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:13.037768 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" event={"ID":"76d75314-94e5-4826-9437-19ed59b8ac61","Type":"ContainerDied","Data":"aed7f0a41f0eb94b669f24c1e916cb7dc7a63b72d1d16a85a6100cfb889e0ef0"} Apr 22 20:32:13.037850 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:13.037776 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" Apr 22 20:32:13.037850 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:13.037795 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d" event={"ID":"76d75314-94e5-4826-9437-19ed59b8ac61","Type":"ContainerDied","Data":"e913f9c515a7fe1c89e20675b8c6d5358c67d95689de260887ddf500b76f4be8"} Apr 22 20:32:13.037850 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:13.037809 2579 scope.go:117] "RemoveContainer" containerID="aed7f0a41f0eb94b669f24c1e916cb7dc7a63b72d1d16a85a6100cfb889e0ef0" Apr 22 20:32:13.044809 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:13.044780 2579 scope.go:117] "RemoveContainer" containerID="719a9b243b66e5a29fc040f7c6b115e77901d372612fac0edd3ecdee7206f628" Apr 22 20:32:13.051829 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:13.051799 2579 scope.go:117] "RemoveContainer" containerID="aed7f0a41f0eb94b669f24c1e916cb7dc7a63b72d1d16a85a6100cfb889e0ef0" Apr 22 20:32:13.052096 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:32:13.052072 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aed7f0a41f0eb94b669f24c1e916cb7dc7a63b72d1d16a85a6100cfb889e0ef0\": container with ID starting with aed7f0a41f0eb94b669f24c1e916cb7dc7a63b72d1d16a85a6100cfb889e0ef0 not found: ID does not exist" containerID="aed7f0a41f0eb94b669f24c1e916cb7dc7a63b72d1d16a85a6100cfb889e0ef0" Apr 22 20:32:13.052194 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:13.052109 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aed7f0a41f0eb94b669f24c1e916cb7dc7a63b72d1d16a85a6100cfb889e0ef0"} err="failed to get container status \"aed7f0a41f0eb94b669f24c1e916cb7dc7a63b72d1d16a85a6100cfb889e0ef0\": rpc error: code = NotFound desc = could not find container \"aed7f0a41f0eb94b669f24c1e916cb7dc7a63b72d1d16a85a6100cfb889e0ef0\": container with ID starting with aed7f0a41f0eb94b669f24c1e916cb7dc7a63b72d1d16a85a6100cfb889e0ef0 not found: ID does not exist" Apr 22 20:32:13.052194 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:13.052153 2579 scope.go:117] "RemoveContainer" containerID="719a9b243b66e5a29fc040f7c6b115e77901d372612fac0edd3ecdee7206f628" Apr 22 20:32:13.052445 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:32:13.052423 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"719a9b243b66e5a29fc040f7c6b115e77901d372612fac0edd3ecdee7206f628\": container with ID starting with 719a9b243b66e5a29fc040f7c6b115e77901d372612fac0edd3ecdee7206f628 not found: ID does not exist" containerID="719a9b243b66e5a29fc040f7c6b115e77901d372612fac0edd3ecdee7206f628" Apr 22 20:32:13.052514 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:13.052455 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"719a9b243b66e5a29fc040f7c6b115e77901d372612fac0edd3ecdee7206f628"} err="failed to get container status \"719a9b243b66e5a29fc040f7c6b115e77901d372612fac0edd3ecdee7206f628\": rpc error: code = NotFound desc = could not find container \"719a9b243b66e5a29fc040f7c6b115e77901d372612fac0edd3ecdee7206f628\": container with ID starting with 719a9b243b66e5a29fc040f7c6b115e77901d372612fac0edd3ecdee7206f628 not found: ID does not exist" Apr 22 20:32:13.164723 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:13.164698 2579 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76d75314-94e5-4826-9437-19ed59b8ac61-kserve-provision-location\") pod \"76d75314-94e5-4826-9437-19ed59b8ac61\" (UID: \"76d75314-94e5-4826-9437-19ed59b8ac61\") " Apr 22 20:32:13.165010 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:13.164990 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76d75314-94e5-4826-9437-19ed59b8ac61-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "76d75314-94e5-4826-9437-19ed59b8ac61" (UID: "76d75314-94e5-4826-9437-19ed59b8ac61"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:32:13.266095 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:13.266058 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76d75314-94e5-4826-9437-19ed59b8ac61-kserve-provision-location\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:32:13.357610 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:13.357586 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d"] Apr 22 20:32:13.362817 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:13.362797 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-bxx7d"] Apr 22 20:32:14.042082 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:14.042051 2579 generic.go:358] "Generic (PLEG): container finished" podID="f4e0964f-cd80-4657-a859-18c446344122" containerID="d515e8a648b3738b16cf3c34578dde92466e1a7c10efa57c7eff8855ded03b93" exitCode=0 Apr 22 20:32:14.042457 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:14.042124 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" event={"ID":"f4e0964f-cd80-4657-a859-18c446344122","Type":"ContainerDied","Data":"d515e8a648b3738b16cf3c34578dde92466e1a7c10efa57c7eff8855ded03b93"} Apr 22 20:32:14.692850 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:14.692817 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76d75314-94e5-4826-9437-19ed59b8ac61" path="/var/lib/kubelet/pods/76d75314-94e5-4826-9437-19ed59b8ac61/volumes" Apr 22 20:32:15.045920 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:15.045838 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" event={"ID":"f4e0964f-cd80-4657-a859-18c446344122","Type":"ContainerStarted","Data":"809927d6c5878363debdab3d12a9e3e8083be41970fa59ae8cffa92f85f1e531"} Apr 22 20:32:15.046320 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:15.046116 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" Apr 22 20:32:15.047468 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:15.047446 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" podUID="f4e0964f-cd80-4657-a859-18c446344122" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 20:32:15.060783 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:15.060736 2579 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" podStartSLOduration=7.060724518 podStartE2EDuration="7.060724518s" podCreationTimestamp="2026-04-22 20:32:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:32:15.059548291 +0000 UTC m=+2041.013911091" watchObservedRunningTime="2026-04-22 20:32:15.060724518 +0000 UTC m=+2041.015087317" Apr 22 20:32:16.049185 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:16.049127 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" podUID="f4e0964f-cd80-4657-a859-18c446344122" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 20:32:26.049923 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:26.049884 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" podUID="f4e0964f-cd80-4657-a859-18c446344122" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 20:32:36.049461 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:36.049375 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" podUID="f4e0964f-cd80-4657-a859-18c446344122" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 20:32:46.049827 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:46.049786 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" podUID="f4e0964f-cd80-4657-a859-18c446344122" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 20:32:56.049922 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:32:56.049876 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" podUID="f4e0964f-cd80-4657-a859-18c446344122" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 20:33:06.049645 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:06.049604 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" podUID="f4e0964f-cd80-4657-a859-18c446344122" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 20:33:16.049344 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:16.049301 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" podUID="f4e0964f-cd80-4657-a859-18c446344122" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 20:33:16.689182 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:16.689115 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" podUID="f4e0964f-cd80-4657-a859-18c446344122" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 20:33:26.692419 
ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:26.692394 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" Apr 22 20:33:29.056602 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:29.056568 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d"] Apr 22 20:33:29.056985 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:29.056828 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" podUID="f4e0964f-cd80-4657-a859-18c446344122" containerName="kserve-container" containerID="cri-o://809927d6c5878363debdab3d12a9e3e8083be41970fa59ae8cffa92f85f1e531" gracePeriod=30 Apr 22 20:33:29.111482 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:29.111453 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g"] Apr 22 20:33:29.111693 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:29.111682 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76d75314-94e5-4826-9437-19ed59b8ac61" containerName="storage-initializer" Apr 22 20:33:29.111735 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:29.111695 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d75314-94e5-4826-9437-19ed59b8ac61" containerName="storage-initializer" Apr 22 20:33:29.111735 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:29.111717 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76d75314-94e5-4826-9437-19ed59b8ac61" containerName="kserve-container" Apr 22 20:33:29.111735 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:29.111722 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d75314-94e5-4826-9437-19ed59b8ac61" containerName="kserve-container" Apr 22 20:33:29.111825 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:29.111773 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="76d75314-94e5-4826-9437-19ed59b8ac61" containerName="kserve-container" Apr 22 20:33:29.114580 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:29.114564 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" Apr 22 20:33:29.123226 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:29.123198 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g"] Apr 22 20:33:29.167790 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:29.167766 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a70dadb8-a1d8-4de0-b677-809132ecfbee-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g\" (UID: \"a70dadb8-a1d8-4de0-b677-809132ecfbee\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" Apr 22 20:33:29.268075 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:29.268045 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a70dadb8-a1d8-4de0-b677-809132ecfbee-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g\" (UID: \"a70dadb8-a1d8-4de0-b677-809132ecfbee\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" Apr 22 20:33:29.268394 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:29.268377 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a70dadb8-a1d8-4de0-b677-809132ecfbee-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g\" (UID: \"a70dadb8-a1d8-4de0-b677-809132ecfbee\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" Apr 22 20:33:29.423998 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:29.423970 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" Apr 22 20:33:29.540986 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:29.540957 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g"] Apr 22 20:33:29.545097 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:33:29.545070 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda70dadb8_a1d8_4de0_b677_809132ecfbee.slice/crio-13dd75d78bf48a6a9c6e3de6700beeeac41e9255821b656cf055a86044936764 WatchSource:0}: Error finding container 13dd75d78bf48a6a9c6e3de6700beeeac41e9255821b656cf055a86044936764: Status 404 returned error can't find the container with id 13dd75d78bf48a6a9c6e3de6700beeeac41e9255821b656cf055a86044936764 Apr 22 20:33:30.240257 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:30.240220 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" event={"ID":"a70dadb8-a1d8-4de0-b677-809132ecfbee","Type":"ContainerStarted","Data":"1edc02dc12760b278d9051126f99e4628d4ebbe11b5718fbf76b539cba7ec55d"} Apr 22 20:33:30.240257 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:30.240256 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" event={"ID":"a70dadb8-a1d8-4de0-b677-809132ecfbee","Type":"ContainerStarted","Data":"13dd75d78bf48a6a9c6e3de6700beeeac41e9255821b656cf055a86044936764"} Apr 22 20:33:34.250336 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:34.250303 2579 generic.go:358] "Generic (PLEG): container finished" podID="a70dadb8-a1d8-4de0-b677-809132ecfbee" containerID="1edc02dc12760b278d9051126f99e4628d4ebbe11b5718fbf76b539cba7ec55d" exitCode=0 Apr 22 20:33:34.250725 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:34.250379 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" event={"ID":"a70dadb8-a1d8-4de0-b677-809132ecfbee","Type":"ContainerDied","Data":"1edc02dc12760b278d9051126f99e4628d4ebbe11b5718fbf76b539cba7ec55d"} Apr 22 20:33:34.890985 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:34.890962 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" Apr 22 20:33:34.903129 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:34.903106 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4e0964f-cd80-4657-a859-18c446344122-kserve-provision-location\") pod \"f4e0964f-cd80-4657-a859-18c446344122\" (UID: \"f4e0964f-cd80-4657-a859-18c446344122\") " Apr 22 20:33:34.903406 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:34.903384 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4e0964f-cd80-4657-a859-18c446344122-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f4e0964f-cd80-4657-a859-18c446344122" (UID: "f4e0964f-cd80-4657-a859-18c446344122"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:33:35.004363 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:35.004303 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4e0964f-cd80-4657-a859-18c446344122-kserve-provision-location\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:33:35.254683 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:35.254591 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" event={"ID":"a70dadb8-a1d8-4de0-b677-809132ecfbee","Type":"ContainerStarted","Data":"b6490dc5e053272593d77247f6c3d6f076c13ad2e520281e5b7bc5f9c4e8786a"} Apr 22 20:33:35.255101 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:35.254924 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" Apr 22 20:33:35.255861 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:35.255838 2579 generic.go:358] "Generic (PLEG): container finished" podID="f4e0964f-cd80-4657-a859-18c446344122" containerID="809927d6c5878363debdab3d12a9e3e8083be41970fa59ae8cffa92f85f1e531" exitCode=0 Apr 22 20:33:35.255937 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:35.255910 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" event={"ID":"f4e0964f-cd80-4657-a859-18c446344122","Type":"ContainerDied","Data":"809927d6c5878363debdab3d12a9e3e8083be41970fa59ae8cffa92f85f1e531"} Apr 22 20:33:35.255937 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:35.255929 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" event={"ID":"f4e0964f-cd80-4657-a859-18c446344122","Type":"ContainerDied","Data":"1d6926311eb8242eaa3e6809f35d106e983f5234263781fa17e0ece145948f32"} Apr 22 20:33:35.256024 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:35.255945 2579 scope.go:117] "RemoveContainer" containerID="809927d6c5878363debdab3d12a9e3e8083be41970fa59ae8cffa92f85f1e531" Apr 22 20:33:35.256024 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:35.255952 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d" Apr 22 20:33:35.263435 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:35.263417 2579 scope.go:117] "RemoveContainer" containerID="d515e8a648b3738b16cf3c34578dde92466e1a7c10efa57c7eff8855ded03b93" Apr 22 20:33:35.269925 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:35.269909 2579 scope.go:117] "RemoveContainer" containerID="809927d6c5878363debdab3d12a9e3e8083be41970fa59ae8cffa92f85f1e531" Apr 22 20:33:35.270204 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:33:35.270178 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"809927d6c5878363debdab3d12a9e3e8083be41970fa59ae8cffa92f85f1e531\": container with ID starting with 809927d6c5878363debdab3d12a9e3e8083be41970fa59ae8cffa92f85f1e531 not found: ID does not exist" containerID="809927d6c5878363debdab3d12a9e3e8083be41970fa59ae8cffa92f85f1e531" Apr 22 20:33:35.270286 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:35.270210 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"809927d6c5878363debdab3d12a9e3e8083be41970fa59ae8cffa92f85f1e531"} err="failed to get container status \"809927d6c5878363debdab3d12a9e3e8083be41970fa59ae8cffa92f85f1e531\": rpc error: code = NotFound desc = could not find container \"809927d6c5878363debdab3d12a9e3e8083be41970fa59ae8cffa92f85f1e531\": container with ID starting with 809927d6c5878363debdab3d12a9e3e8083be41970fa59ae8cffa92f85f1e531 not found: ID does not exist" Apr 22 20:33:35.270286 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:35.270227 2579 scope.go:117] "RemoveContainer" containerID="d515e8a648b3738b16cf3c34578dde92466e1a7c10efa57c7eff8855ded03b93" Apr 22 20:33:35.270462 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:33:35.270446 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d515e8a648b3738b16cf3c34578dde92466e1a7c10efa57c7eff8855ded03b93\": container with ID starting with d515e8a648b3738b16cf3c34578dde92466e1a7c10efa57c7eff8855ded03b93 not found: ID does not exist" containerID="d515e8a648b3738b16cf3c34578dde92466e1a7c10efa57c7eff8855ded03b93" Apr 22 20:33:35.270495 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:35.270468 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d515e8a648b3738b16cf3c34578dde92466e1a7c10efa57c7eff8855ded03b93"} err="failed to get container status \"d515e8a648b3738b16cf3c34578dde92466e1a7c10efa57c7eff8855ded03b93\": rpc error: code = NotFound desc = could not find container \"d515e8a648b3738b16cf3c34578dde92466e1a7c10efa57c7eff8855ded03b93\": container with ID starting with d515e8a648b3738b16cf3c34578dde92466e1a7c10efa57c7eff8855ded03b93 not found: ID does not exist" Apr 22 20:33:35.272279 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:35.272247 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" podStartSLOduration=6.272237647 podStartE2EDuration="6.272237647s" podCreationTimestamp="2026-04-22 20:33:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:33:35.271750296 +0000 UTC m=+2121.226113079" watchObservedRunningTime="2026-04-22 20:33:35.272237647 +0000 UTC m=+2121.226600448" Apr 22 20:33:35.284297 ip-10-0-141-46 
kubenswrapper[2579]: I0422 20:33:35.284273 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d"] Apr 22 20:33:35.287225 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:35.287203 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-6pk5d"] Apr 22 20:33:36.693279 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:33:36.693247 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4e0964f-cd80-4657-a859-18c446344122" path="/var/lib/kubelet/pods/f4e0964f-cd80-4657-a859-18c446344122/volumes" Apr 22 20:34:06.263714 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:06.263632 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" podUID="a70dadb8-a1d8-4de0-b677-809132ecfbee" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.24:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 20:34:16.262594 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:16.262548 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" podUID="a70dadb8-a1d8-4de0-b677-809132ecfbee" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.24:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 20:34:26.262383 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:26.262339 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" podUID="a70dadb8-a1d8-4de0-b677-809132ecfbee" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.24:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 20:34:36.262289 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:36.262252 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" podUID="a70dadb8-a1d8-4de0-b677-809132ecfbee" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.24:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 20:34:39.690030 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:39.689978 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" podUID="a70dadb8-a1d8-4de0-b677-809132ecfbee" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.24:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 20:34:49.693374 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:49.693342 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" Apr 22 20:34:59.278996 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:59.278964 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g"] Apr 22 20:34:59.279514 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:59.279321 2579 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" podUID="a70dadb8-a1d8-4de0-b677-809132ecfbee" containerName="kserve-container" containerID="cri-o://b6490dc5e053272593d77247f6c3d6f076c13ad2e520281e5b7bc5f9c4e8786a" gracePeriod=30 Apr 22 20:34:59.356795 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:59.356771 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d"] Apr 22 20:34:59.357060 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:59.357047 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4e0964f-cd80-4657-a859-18c446344122" containerName="storage-initializer" Apr 22 20:34:59.357105 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:59.357062 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e0964f-cd80-4657-a859-18c446344122" containerName="storage-initializer" Apr 22 20:34:59.357105 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:59.357071 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4e0964f-cd80-4657-a859-18c446344122" containerName="kserve-container" Apr 22 20:34:59.357105 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:59.357077 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e0964f-cd80-4657-a859-18c446344122" containerName="kserve-container" Apr 22 20:34:59.357215 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:59.357117 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4e0964f-cd80-4657-a859-18c446344122" containerName="kserve-container" Apr 22 20:34:59.359812 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:59.359798 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" Apr 22 20:34:59.367788 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:59.367765 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d"] Apr 22 20:34:59.485722 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:59.485700 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b84a4da-44a5-4ad5-9fde-0514d8be28e5-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d\" (UID: \"6b84a4da-44a5-4ad5-9fde-0514d8be28e5\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" Apr 22 20:34:59.586478 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:59.586413 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b84a4da-44a5-4ad5-9fde-0514d8be28e5-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d\" (UID: \"6b84a4da-44a5-4ad5-9fde-0514d8be28e5\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" Apr 22 20:34:59.586706 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:59.586692 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b84a4da-44a5-4ad5-9fde-0514d8be28e5-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d\" (UID: \"6b84a4da-44a5-4ad5-9fde-0514d8be28e5\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" Apr 22 20:34:59.669566 
ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:59.669551 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" Apr 22 20:34:59.690348 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:59.690317 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" podUID="a70dadb8-a1d8-4de0-b677-809132ecfbee" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.24:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 20:34:59.786467 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:34:59.786421 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d"] Apr 22 20:34:59.790326 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:34:59.790290 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b84a4da_44a5_4ad5_9fde_0514d8be28e5.slice/crio-3d127600854fdf561ba6d11cad3bf2599f62e91bdff4482e72adbba51539ab18 WatchSource:0}: Error finding container 3d127600854fdf561ba6d11cad3bf2599f62e91bdff4482e72adbba51539ab18: Status 404 returned error can't find the container with id 3d127600854fdf561ba6d11cad3bf2599f62e91bdff4482e72adbba51539ab18 Apr 22 20:35:00.483290 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:00.483257 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" event={"ID":"6b84a4da-44a5-4ad5-9fde-0514d8be28e5","Type":"ContainerStarted","Data":"bb3aa279872b04f5bcbd2206358b77e79d8140d58d4648d2f86b5b136405824d"} Apr 22 20:35:00.483290 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:00.483293 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" event={"ID":"6b84a4da-44a5-4ad5-9fde-0514d8be28e5","Type":"ContainerStarted","Data":"3d127600854fdf561ba6d11cad3bf2599f62e91bdff4482e72adbba51539ab18"} Apr 22 20:35:03.491636 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:03.491603 2579 generic.go:358] "Generic (PLEG): container finished" podID="6b84a4da-44a5-4ad5-9fde-0514d8be28e5" containerID="bb3aa279872b04f5bcbd2206358b77e79d8140d58d4648d2f86b5b136405824d" exitCode=0 Apr 22 20:35:03.491981 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:03.491679 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" event={"ID":"6b84a4da-44a5-4ad5-9fde-0514d8be28e5","Type":"ContainerDied","Data":"bb3aa279872b04f5bcbd2206358b77e79d8140d58d4648d2f86b5b136405824d"} Apr 22 20:35:03.716562 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:03.716543 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" Apr 22 20:35:03.812748 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:03.812719 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a70dadb8-a1d8-4de0-b677-809132ecfbee-kserve-provision-location\") pod \"a70dadb8-a1d8-4de0-b677-809132ecfbee\" (UID: \"a70dadb8-a1d8-4de0-b677-809132ecfbee\") " Apr 22 20:35:03.813037 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:03.813013 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a70dadb8-a1d8-4de0-b677-809132ecfbee-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a70dadb8-a1d8-4de0-b677-809132ecfbee" (UID: "a70dadb8-a1d8-4de0-b677-809132ecfbee"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:35:03.913341 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:03.913283 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a70dadb8-a1d8-4de0-b677-809132ecfbee-kserve-provision-location\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:35:04.495152 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:04.495102 2579 generic.go:358] "Generic (PLEG): container finished" podID="a70dadb8-a1d8-4de0-b677-809132ecfbee" containerID="b6490dc5e053272593d77247f6c3d6f076c13ad2e520281e5b7bc5f9c4e8786a" exitCode=0 Apr 22 20:35:04.495603 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:04.495183 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" Apr 22 20:35:04.495603 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:04.495182 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" event={"ID":"a70dadb8-a1d8-4de0-b677-809132ecfbee","Type":"ContainerDied","Data":"b6490dc5e053272593d77247f6c3d6f076c13ad2e520281e5b7bc5f9c4e8786a"} Apr 22 20:35:04.495603 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:04.495293 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g" event={"ID":"a70dadb8-a1d8-4de0-b677-809132ecfbee","Type":"ContainerDied","Data":"13dd75d78bf48a6a9c6e3de6700beeeac41e9255821b656cf055a86044936764"} Apr 22 20:35:04.495603 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:04.495318 2579 scope.go:117] "RemoveContainer" containerID="b6490dc5e053272593d77247f6c3d6f076c13ad2e520281e5b7bc5f9c4e8786a" Apr 22 20:35:04.496991 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:04.496970 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" event={"ID":"6b84a4da-44a5-4ad5-9fde-0514d8be28e5","Type":"ContainerStarted","Data":"c771496f9640729f76d65532daa59038febc6e695db05809996c4574e9032909"} Apr 22 20:35:04.497194 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:04.497178 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" Apr 22 20:35:04.503413 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:04.503399 2579 scope.go:117] "RemoveContainer" 
containerID="1edc02dc12760b278d9051126f99e4628d4ebbe11b5718fbf76b539cba7ec55d" Apr 22 20:35:04.509939 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:04.509922 2579 scope.go:117] "RemoveContainer" containerID="b6490dc5e053272593d77247f6c3d6f076c13ad2e520281e5b7bc5f9c4e8786a" Apr 22 20:35:04.510226 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:35:04.510206 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6490dc5e053272593d77247f6c3d6f076c13ad2e520281e5b7bc5f9c4e8786a\": container with ID starting with b6490dc5e053272593d77247f6c3d6f076c13ad2e520281e5b7bc5f9c4e8786a not found: ID does not exist" containerID="b6490dc5e053272593d77247f6c3d6f076c13ad2e520281e5b7bc5f9c4e8786a" Apr 22 20:35:04.510274 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:04.510234 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6490dc5e053272593d77247f6c3d6f076c13ad2e520281e5b7bc5f9c4e8786a"} err="failed to get container status \"b6490dc5e053272593d77247f6c3d6f076c13ad2e520281e5b7bc5f9c4e8786a\": rpc error: code = NotFound desc = could not find container \"b6490dc5e053272593d77247f6c3d6f076c13ad2e520281e5b7bc5f9c4e8786a\": container with ID starting with b6490dc5e053272593d77247f6c3d6f076c13ad2e520281e5b7bc5f9c4e8786a not found: ID does not exist" Apr 22 20:35:04.510274 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:04.510250 2579 scope.go:117] "RemoveContainer" containerID="1edc02dc12760b278d9051126f99e4628d4ebbe11b5718fbf76b539cba7ec55d" Apr 22 20:35:04.510481 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:35:04.510465 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1edc02dc12760b278d9051126f99e4628d4ebbe11b5718fbf76b539cba7ec55d\": container with ID starting with 1edc02dc12760b278d9051126f99e4628d4ebbe11b5718fbf76b539cba7ec55d not found: ID does not exist" containerID="1edc02dc12760b278d9051126f99e4628d4ebbe11b5718fbf76b539cba7ec55d" Apr 22 20:35:04.510525 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:04.510487 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1edc02dc12760b278d9051126f99e4628d4ebbe11b5718fbf76b539cba7ec55d"} err="failed to get container status \"1edc02dc12760b278d9051126f99e4628d4ebbe11b5718fbf76b539cba7ec55d\": rpc error: code = NotFound desc = could not find container \"1edc02dc12760b278d9051126f99e4628d4ebbe11b5718fbf76b539cba7ec55d\": container with ID starting with 1edc02dc12760b278d9051126f99e4628d4ebbe11b5718fbf76b539cba7ec55d not found: ID does not exist" Apr 22 20:35:04.514071 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:04.514016 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" podStartSLOduration=5.5140055409999995 podStartE2EDuration="5.514005541s" podCreationTimestamp="2026-04-22 20:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:35:04.513434071 +0000 UTC m=+2210.467796873" watchObservedRunningTime="2026-04-22 20:35:04.514005541 +0000 UTC m=+2210.468368341" Apr 22 20:35:04.524414 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:04.524393 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g"] Apr 22 20:35:04.528230 ip-10-0-141-46 
kubenswrapper[2579]: I0422 20:35:04.528210 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-pkg2g"] Apr 22 20:35:04.696326 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:04.696295 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a70dadb8-a1d8-4de0-b677-809132ecfbee" path="/var/lib/kubelet/pods/a70dadb8-a1d8-4de0-b677-809132ecfbee/volumes" Apr 22 20:35:35.503204 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:35.503085 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" podUID="6b84a4da-44a5-4ad5-9fde-0514d8be28e5" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.25:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 20:35:45.501491 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:45.501445 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" podUID="6b84a4da-44a5-4ad5-9fde-0514d8be28e5" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.25:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 20:35:55.502076 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:35:55.502034 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" podUID="6b84a4da-44a5-4ad5-9fde-0514d8be28e5" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.25:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 20:36:05.502258 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:05.502213 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" podUID="6b84a4da-44a5-4ad5-9fde-0514d8be28e5" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.25:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 20:36:05.689042 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:05.689006 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" podUID="6b84a4da-44a5-4ad5-9fde-0514d8be28e5" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.25:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 20:36:15.692881 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:15.692854 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" Apr 22 20:36:19.442726 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:19.442692 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d"] Apr 22 20:36:19.443203 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:19.443036 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" podUID="6b84a4da-44a5-4ad5-9fde-0514d8be28e5" containerName="kserve-container" 
containerID="cri-o://c771496f9640729f76d65532daa59038febc6e695db05809996c4574e9032909" gracePeriod=30 Apr 22 20:36:19.486106 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:19.486078 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw"] Apr 22 20:36:19.486425 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:19.486409 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a70dadb8-a1d8-4de0-b677-809132ecfbee" containerName="storage-initializer" Apr 22 20:36:19.486496 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:19.486428 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a70dadb8-a1d8-4de0-b677-809132ecfbee" containerName="storage-initializer" Apr 22 20:36:19.486496 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:19.486449 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a70dadb8-a1d8-4de0-b677-809132ecfbee" containerName="kserve-container" Apr 22 20:36:19.486496 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:19.486459 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a70dadb8-a1d8-4de0-b677-809132ecfbee" containerName="kserve-container" Apr 22 20:36:19.486637 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:19.486525 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a70dadb8-a1d8-4de0-b677-809132ecfbee" containerName="kserve-container" Apr 22 20:36:19.489330 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:19.489313 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" Apr 22 20:36:19.498531 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:19.498508 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw"] Apr 22 20:36:19.616921 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:19.616897 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/915fd165-8a7d-4c3e-b13c-1b394c3dd163-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw\" (UID: \"915fd165-8a7d-4c3e-b13c-1b394c3dd163\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" Apr 22 20:36:19.717990 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:19.717919 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/915fd165-8a7d-4c3e-b13c-1b394c3dd163-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw\" (UID: \"915fd165-8a7d-4c3e-b13c-1b394c3dd163\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" Apr 22 20:36:19.718321 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:19.718304 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/915fd165-8a7d-4c3e-b13c-1b394c3dd163-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw\" (UID: \"915fd165-8a7d-4c3e-b13c-1b394c3dd163\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" Apr 22 20:36:19.798811 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:19.798783 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" Apr 22 20:36:19.920619 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:19.920591 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw"] Apr 22 20:36:19.923362 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:36:19.923334 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod915fd165_8a7d_4c3e_b13c_1b394c3dd163.slice/crio-81f01951de38bf1af379435827e28b8bfb6d2a2dae03a0fe4bf48f964f630629 WatchSource:0}: Error finding container 81f01951de38bf1af379435827e28b8bfb6d2a2dae03a0fe4bf48f964f630629: Status 404 returned error can't find the container with id 81f01951de38bf1af379435827e28b8bfb6d2a2dae03a0fe4bf48f964f630629 Apr 22 20:36:19.925096 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:19.925081 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:36:20.707689 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:20.707653 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" event={"ID":"915fd165-8a7d-4c3e-b13c-1b394c3dd163","Type":"ContainerStarted","Data":"e908496e8bd54b502ea10891d68d10e872b4dc053fbb1470f6c7e3cd4c16f8c9"} Apr 22 20:36:20.707689 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:20.707686 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" event={"ID":"915fd165-8a7d-4c3e-b13c-1b394c3dd163","Type":"ContainerStarted","Data":"81f01951de38bf1af379435827e28b8bfb6d2a2dae03a0fe4bf48f964f630629"} Apr 22 20:36:23.482969 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:23.482944 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" Apr 22 20:36:23.545835 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:23.545812 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b84a4da-44a5-4ad5-9fde-0514d8be28e5-kserve-provision-location\") pod \"6b84a4da-44a5-4ad5-9fde-0514d8be28e5\" (UID: \"6b84a4da-44a5-4ad5-9fde-0514d8be28e5\") " Apr 22 20:36:23.546108 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:23.546087 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b84a4da-44a5-4ad5-9fde-0514d8be28e5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6b84a4da-44a5-4ad5-9fde-0514d8be28e5" (UID: "6b84a4da-44a5-4ad5-9fde-0514d8be28e5"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:36:23.647094 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:23.647064 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b84a4da-44a5-4ad5-9fde-0514d8be28e5-kserve-provision-location\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:36:23.721227 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:23.721200 2579 generic.go:358] "Generic (PLEG): container finished" podID="915fd165-8a7d-4c3e-b13c-1b394c3dd163" containerID="e908496e8bd54b502ea10891d68d10e872b4dc053fbb1470f6c7e3cd4c16f8c9" exitCode=0 Apr 22 20:36:23.721336 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:23.721288 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" event={"ID":"915fd165-8a7d-4c3e-b13c-1b394c3dd163","Type":"ContainerDied","Data":"e908496e8bd54b502ea10891d68d10e872b4dc053fbb1470f6c7e3cd4c16f8c9"} Apr 22 20:36:23.722839 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:23.722811 2579 generic.go:358] "Generic (PLEG): container finished" podID="6b84a4da-44a5-4ad5-9fde-0514d8be28e5" containerID="c771496f9640729f76d65532daa59038febc6e695db05809996c4574e9032909" exitCode=0 Apr 22 20:36:23.722949 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:23.722918 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" event={"ID":"6b84a4da-44a5-4ad5-9fde-0514d8be28e5","Type":"ContainerDied","Data":"c771496f9640729f76d65532daa59038febc6e695db05809996c4574e9032909"} Apr 22 20:36:23.723003 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:23.722958 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" Apr 22 20:36:23.723003 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:23.722970 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d" event={"ID":"6b84a4da-44a5-4ad5-9fde-0514d8be28e5","Type":"ContainerDied","Data":"3d127600854fdf561ba6d11cad3bf2599f62e91bdff4482e72adbba51539ab18"} Apr 22 20:36:23.723003 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:23.722998 2579 scope.go:117] "RemoveContainer" containerID="c771496f9640729f76d65532daa59038febc6e695db05809996c4574e9032909" Apr 22 20:36:23.730413 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:23.730338 2579 scope.go:117] "RemoveContainer" containerID="bb3aa279872b04f5bcbd2206358b77e79d8140d58d4648d2f86b5b136405824d" Apr 22 20:36:23.738067 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:23.738043 2579 scope.go:117] "RemoveContainer" containerID="c771496f9640729f76d65532daa59038febc6e695db05809996c4574e9032909" Apr 22 20:36:23.738341 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:36:23.738323 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c771496f9640729f76d65532daa59038febc6e695db05809996c4574e9032909\": container with ID starting with c771496f9640729f76d65532daa59038febc6e695db05809996c4574e9032909 not found: ID does not exist" containerID="c771496f9640729f76d65532daa59038febc6e695db05809996c4574e9032909" Apr 22 20:36:23.738406 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:23.738348 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c771496f9640729f76d65532daa59038febc6e695db05809996c4574e9032909"} err="failed to get container status \"c771496f9640729f76d65532daa59038febc6e695db05809996c4574e9032909\": rpc error: code = NotFound desc = could not find container \"c771496f9640729f76d65532daa59038febc6e695db05809996c4574e9032909\": container with ID starting with c771496f9640729f76d65532daa59038febc6e695db05809996c4574e9032909 not found: ID does not exist" Apr 22 20:36:23.738406 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:23.738363 2579 scope.go:117] "RemoveContainer" containerID="bb3aa279872b04f5bcbd2206358b77e79d8140d58d4648d2f86b5b136405824d" Apr 22 20:36:23.738649 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:36:23.738634 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb3aa279872b04f5bcbd2206358b77e79d8140d58d4648d2f86b5b136405824d\": container with ID starting with bb3aa279872b04f5bcbd2206358b77e79d8140d58d4648d2f86b5b136405824d not found: ID does not exist" containerID="bb3aa279872b04f5bcbd2206358b77e79d8140d58d4648d2f86b5b136405824d" Apr 22 20:36:23.738700 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:23.738653 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb3aa279872b04f5bcbd2206358b77e79d8140d58d4648d2f86b5b136405824d"} err="failed to get container status \"bb3aa279872b04f5bcbd2206358b77e79d8140d58d4648d2f86b5b136405824d\": rpc error: code = NotFound desc = could not find container \"bb3aa279872b04f5bcbd2206358b77e79d8140d58d4648d2f86b5b136405824d\": container with ID starting with bb3aa279872b04f5bcbd2206358b77e79d8140d58d4648d2f86b5b136405824d not found: ID does not exist" Apr 22 20:36:23.745733 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:23.745703 2579 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d"] Apr 22 20:36:23.747561 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:23.747543 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-srl5d"] Apr 22 20:36:24.692420 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:24.692386 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b84a4da-44a5-4ad5-9fde-0514d8be28e5" path="/var/lib/kubelet/pods/6b84a4da-44a5-4ad5-9fde-0514d8be28e5/volumes" Apr 22 20:36:24.727546 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:24.727514 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" event={"ID":"915fd165-8a7d-4c3e-b13c-1b394c3dd163","Type":"ContainerStarted","Data":"25929b0fb4abdc4725d09c922f4b36e90d3fece87848d489ca2063047d2ba031"} Apr 22 20:36:24.727745 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:24.727729 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" Apr 22 20:36:24.743424 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:24.743384 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" podStartSLOduration=5.7433715979999995 podStartE2EDuration="5.743371598s" podCreationTimestamp="2026-04-22 20:36:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:36:24.742001569 +0000 UTC m=+2290.696364370" watchObservedRunningTime="2026-04-22 20:36:24.743371598 +0000 UTC m=+2290.697734399" Apr 22 20:36:55.731678 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:36:55.731632 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" podUID="915fd165-8a7d-4c3e-b13c-1b394c3dd163" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.26:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 20:37:05.730412 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:05.730322 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" podUID="915fd165-8a7d-4c3e-b13c-1b394c3dd163" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.26:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 20:37:15.730445 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:15.730407 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" podUID="915fd165-8a7d-4c3e-b13c-1b394c3dd163" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.26:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 20:37:25.730937 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:25.730894 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" podUID="915fd165-8a7d-4c3e-b13c-1b394c3dd163" containerName="kserve-container" probeResult="failure" output="Get 
\"http://10.132.0.26:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 20:37:31.689199 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:31.689160 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" podUID="915fd165-8a7d-4c3e-b13c-1b394c3dd163" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.26:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 20:37:41.693154 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:41.693106 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" Apr 22 20:37:49.633402 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:49.633372 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw"] Apr 22 20:37:49.633769 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:49.633630 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" podUID="915fd165-8a7d-4c3e-b13c-1b394c3dd163" containerName="kserve-container" containerID="cri-o://25929b0fb4abdc4725d09c922f4b36e90d3fece87848d489ca2063047d2ba031" gracePeriod=30 Apr 22 20:37:51.689708 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:51.689667 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" podUID="915fd165-8a7d-4c3e-b13c-1b394c3dd163" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.26:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 20:37:54.772358 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:54.772338 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" Apr 22 20:37:54.839230 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:54.839203 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/915fd165-8a7d-4c3e-b13c-1b394c3dd163-kserve-provision-location\") pod \"915fd165-8a7d-4c3e-b13c-1b394c3dd163\" (UID: \"915fd165-8a7d-4c3e-b13c-1b394c3dd163\") " Apr 22 20:37:54.839518 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:54.839493 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/915fd165-8a7d-4c3e-b13c-1b394c3dd163-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "915fd165-8a7d-4c3e-b13c-1b394c3dd163" (UID: "915fd165-8a7d-4c3e-b13c-1b394c3dd163"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:37:54.940226 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:54.940172 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/915fd165-8a7d-4c3e-b13c-1b394c3dd163-kserve-provision-location\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:37:54.952483 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:54.952453 2579 generic.go:358] "Generic (PLEG): container finished" podID="915fd165-8a7d-4c3e-b13c-1b394c3dd163" containerID="25929b0fb4abdc4725d09c922f4b36e90d3fece87848d489ca2063047d2ba031" exitCode=0 Apr 22 20:37:54.952597 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:54.952517 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" event={"ID":"915fd165-8a7d-4c3e-b13c-1b394c3dd163","Type":"ContainerDied","Data":"25929b0fb4abdc4725d09c922f4b36e90d3fece87848d489ca2063047d2ba031"} Apr 22 20:37:54.952597 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:54.952523 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" Apr 22 20:37:54.952597 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:54.952551 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw" event={"ID":"915fd165-8a7d-4c3e-b13c-1b394c3dd163","Type":"ContainerDied","Data":"81f01951de38bf1af379435827e28b8bfb6d2a2dae03a0fe4bf48f964f630629"} Apr 22 20:37:54.952597 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:54.952572 2579 scope.go:117] "RemoveContainer" containerID="25929b0fb4abdc4725d09c922f4b36e90d3fece87848d489ca2063047d2ba031" Apr 22 20:37:54.960945 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:54.960931 2579 scope.go:117] "RemoveContainer" containerID="e908496e8bd54b502ea10891d68d10e872b4dc053fbb1470f6c7e3cd4c16f8c9" Apr 22 20:37:54.967336 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:54.967318 2579 scope.go:117] "RemoveContainer" containerID="25929b0fb4abdc4725d09c922f4b36e90d3fece87848d489ca2063047d2ba031" Apr 22 20:37:54.967584 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:37:54.967567 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25929b0fb4abdc4725d09c922f4b36e90d3fece87848d489ca2063047d2ba031\": container with ID starting with 25929b0fb4abdc4725d09c922f4b36e90d3fece87848d489ca2063047d2ba031 not found: ID does not exist" containerID="25929b0fb4abdc4725d09c922f4b36e90d3fece87848d489ca2063047d2ba031" Apr 22 20:37:54.967625 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:54.967591 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25929b0fb4abdc4725d09c922f4b36e90d3fece87848d489ca2063047d2ba031"} err="failed to get container status \"25929b0fb4abdc4725d09c922f4b36e90d3fece87848d489ca2063047d2ba031\": rpc error: code = NotFound desc = could not find container \"25929b0fb4abdc4725d09c922f4b36e90d3fece87848d489ca2063047d2ba031\": container with ID starting with 25929b0fb4abdc4725d09c922f4b36e90d3fece87848d489ca2063047d2ba031 not found: ID does not exist" Apr 22 20:37:54.967625 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:54.967608 2579 scope.go:117] "RemoveContainer" containerID="e908496e8bd54b502ea10891d68d10e872b4dc053fbb1470f6c7e3cd4c16f8c9" Apr 22 20:37:54.967840 ip-10-0-141-46 
kubenswrapper[2579]: E0422 20:37:54.967822 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e908496e8bd54b502ea10891d68d10e872b4dc053fbb1470f6c7e3cd4c16f8c9\": container with ID starting with e908496e8bd54b502ea10891d68d10e872b4dc053fbb1470f6c7e3cd4c16f8c9 not found: ID does not exist" containerID="e908496e8bd54b502ea10891d68d10e872b4dc053fbb1470f6c7e3cd4c16f8c9" Apr 22 20:37:54.967876 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:54.967851 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e908496e8bd54b502ea10891d68d10e872b4dc053fbb1470f6c7e3cd4c16f8c9"} err="failed to get container status \"e908496e8bd54b502ea10891d68d10e872b4dc053fbb1470f6c7e3cd4c16f8c9\": rpc error: code = NotFound desc = could not find container \"e908496e8bd54b502ea10891d68d10e872b4dc053fbb1470f6c7e3cd4c16f8c9\": container with ID starting with e908496e8bd54b502ea10891d68d10e872b4dc053fbb1470f6c7e3cd4c16f8c9 not found: ID does not exist" Apr 22 20:37:54.973906 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:54.973886 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw"] Apr 22 20:37:54.977103 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:54.977083 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-s2kjw"] Apr 22 20:37:56.692969 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:37:56.692936 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="915fd165-8a7d-4c3e-b13c-1b394c3dd163" path="/var/lib/kubelet/pods/915fd165-8a7d-4c3e-b13c-1b394c3dd163/volumes" Apr 22 20:44:09.817844 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:09.817810 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r"] Apr 22 20:44:09.818316 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:09.818056 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b84a4da-44a5-4ad5-9fde-0514d8be28e5" containerName="storage-initializer" Apr 22 20:44:09.818316 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:09.818068 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b84a4da-44a5-4ad5-9fde-0514d8be28e5" containerName="storage-initializer" Apr 22 20:44:09.818316 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:09.818079 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b84a4da-44a5-4ad5-9fde-0514d8be28e5" containerName="kserve-container" Apr 22 20:44:09.818316 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:09.818085 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b84a4da-44a5-4ad5-9fde-0514d8be28e5" containerName="kserve-container" Apr 22 20:44:09.818316 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:09.818091 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="915fd165-8a7d-4c3e-b13c-1b394c3dd163" containerName="storage-initializer" Apr 22 20:44:09.818316 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:09.818097 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="915fd165-8a7d-4c3e-b13c-1b394c3dd163" containerName="storage-initializer" Apr 22 20:44:09.818316 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:09.818108 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="915fd165-8a7d-4c3e-b13c-1b394c3dd163" 
containerName="kserve-container" Apr 22 20:44:09.818316 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:09.818113 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="915fd165-8a7d-4c3e-b13c-1b394c3dd163" containerName="kserve-container" Apr 22 20:44:09.818316 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:09.818180 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="915fd165-8a7d-4c3e-b13c-1b394c3dd163" containerName="kserve-container" Apr 22 20:44:09.818316 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:09.818195 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b84a4da-44a5-4ad5-9fde-0514d8be28e5" containerName="kserve-container" Apr 22 20:44:09.823057 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:09.821953 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r" Apr 22 20:44:09.824437 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:09.824413 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t9c5k\"" Apr 22 20:44:09.831955 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:09.831934 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r"] Apr 22 20:44:09.956895 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:09.956863 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ec1bae6-9d01-4005-9a14-889e3d94ef4e-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-jwz8r\" (UID: \"2ec1bae6-9d01-4005-9a14-889e3d94ef4e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r" Apr 22 20:44:10.057988 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:10.057961 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ec1bae6-9d01-4005-9a14-889e3d94ef4e-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-jwz8r\" (UID: \"2ec1bae6-9d01-4005-9a14-889e3d94ef4e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r" Apr 22 20:44:10.058321 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:10.058303 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ec1bae6-9d01-4005-9a14-889e3d94ef4e-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-jwz8r\" (UID: \"2ec1bae6-9d01-4005-9a14-889e3d94ef4e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r" Apr 22 20:44:10.134040 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:10.134018 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r" Apr 22 20:44:10.253986 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:10.253943 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r"] Apr 22 20:44:10.256854 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:44:10.256825 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ec1bae6_9d01_4005_9a14_889e3d94ef4e.slice/crio-294401de90658673452a8b8ac9f75fc690c2cbce5bfb8c9ffb4fd3d184333cf7 WatchSource:0}: Error finding container 294401de90658673452a8b8ac9f75fc690c2cbce5bfb8c9ffb4fd3d184333cf7: Status 404 returned error can't find the container with id 294401de90658673452a8b8ac9f75fc690c2cbce5bfb8c9ffb4fd3d184333cf7 Apr 22 20:44:10.258581 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:10.258566 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:44:10.883259 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:10.883224 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r" event={"ID":"2ec1bae6-9d01-4005-9a14-889e3d94ef4e","Type":"ContainerStarted","Data":"ef9502fba91c1925e0a375e75eeecd0db7e681c53318f951db29cefad9640b84"} Apr 22 20:44:10.883259 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:10.883260 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r" event={"ID":"2ec1bae6-9d01-4005-9a14-889e3d94ef4e","Type":"ContainerStarted","Data":"294401de90658673452a8b8ac9f75fc690c2cbce5bfb8c9ffb4fd3d184333cf7"} Apr 22 20:44:15.897768 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:15.897731 2579 generic.go:358] "Generic (PLEG): container finished" podID="2ec1bae6-9d01-4005-9a14-889e3d94ef4e" containerID="ef9502fba91c1925e0a375e75eeecd0db7e681c53318f951db29cefad9640b84" exitCode=0 Apr 22 20:44:15.898180 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:15.897816 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r" event={"ID":"2ec1bae6-9d01-4005-9a14-889e3d94ef4e","Type":"ContainerDied","Data":"ef9502fba91c1925e0a375e75eeecd0db7e681c53318f951db29cefad9640b84"} Apr 22 20:44:20.912328 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:20.912296 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r" event={"ID":"2ec1bae6-9d01-4005-9a14-889e3d94ef4e","Type":"ContainerStarted","Data":"bdf627e705f465f26518dc43978a6467853656e051a8b5d43b908bea35262de7"} Apr 22 20:44:20.912839 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:20.912576 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r" Apr 22 20:44:20.914005 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:20.913979 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r" podUID="2ec1bae6-9d01-4005-9a14-889e3d94ef4e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 22 20:44:20.940496 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:20.940448 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r" 
podStartSLOduration=7.948977566 podStartE2EDuration="11.940435745s" podCreationTimestamp="2026-04-22 20:44:09 +0000 UTC" firstStartedPulling="2026-04-22 20:44:15.898952052 +0000 UTC m=+2761.853314833" lastFinishedPulling="2026-04-22 20:44:19.890410231 +0000 UTC m=+2765.844773012" observedRunningTime="2026-04-22 20:44:20.93781152 +0000 UTC m=+2766.892174321" watchObservedRunningTime="2026-04-22 20:44:20.940435745 +0000 UTC m=+2766.894798546" Apr 22 20:44:21.914988 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:21.914947 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r" podUID="2ec1bae6-9d01-4005-9a14-889e3d94ef4e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 22 20:44:31.916438 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:31.916408 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r" Apr 22 20:44:49.851597 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:49.851519 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r"] Apr 22 20:44:49.851972 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:49.851833 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r" podUID="2ec1bae6-9d01-4005-9a14-889e3d94ef4e" containerName="kserve-container" containerID="cri-o://bdf627e705f465f26518dc43978a6467853656e051a8b5d43b908bea35262de7" gracePeriod=30 Apr 22 20:44:49.912506 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:49.912472 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk"] Apr 22 20:44:49.915627 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:49.915612 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk" Apr 22 20:44:49.924942 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:49.924913 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk"] Apr 22 20:44:49.934596 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:49.934537 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b133e2d-9b43-40a8-9710-670b7f38a9da-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk\" (UID: \"3b133e2d-9b43-40a8-9710-670b7f38a9da\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk" Apr 22 20:44:50.034932 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:50.034907 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b133e2d-9b43-40a8-9710-670b7f38a9da-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk\" (UID: \"3b133e2d-9b43-40a8-9710-670b7f38a9da\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk" Apr 22 20:44:50.035289 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:50.035273 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b133e2d-9b43-40a8-9710-670b7f38a9da-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk\" (UID: \"3b133e2d-9b43-40a8-9710-670b7f38a9da\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk" Apr 22 20:44:50.226111 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:50.226073 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk" Apr 22 20:44:50.342454 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:50.342433 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk"] Apr 22 20:44:50.344317 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:44:50.344287 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b133e2d_9b43_40a8_9710_670b7f38a9da.slice/crio-8de43c610368621071e44e1d6f847310dfbd10f534897d8f2056df64ac0c8cd3 WatchSource:0}: Error finding container 8de43c610368621071e44e1d6f847310dfbd10f534897d8f2056df64ac0c8cd3: Status 404 returned error can't find the container with id 8de43c610368621071e44e1d6f847310dfbd10f534897d8f2056df64ac0c8cd3 Apr 22 20:44:50.993671 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:50.993640 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk" event={"ID":"3b133e2d-9b43-40a8-9710-670b7f38a9da","Type":"ContainerStarted","Data":"c347f5b354bc93e1dc672cecd57d3e7c945fb4e440ef26530216c85b0b691c72"} Apr 22 20:44:50.993671 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:50.993674 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk" event={"ID":"3b133e2d-9b43-40a8-9710-670b7f38a9da","Type":"ContainerStarted","Data":"8de43c610368621071e44e1d6f847310dfbd10f534897d8f2056df64ac0c8cd3"} Apr 22 20:44:55.004509 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:55.004433 2579 generic.go:358] "Generic (PLEG): container finished" podID="3b133e2d-9b43-40a8-9710-670b7f38a9da" containerID="c347f5b354bc93e1dc672cecd57d3e7c945fb4e440ef26530216c85b0b691c72" exitCode=0 Apr 22 20:44:55.004509 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:55.004492 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk" event={"ID":"3b133e2d-9b43-40a8-9710-670b7f38a9da","Type":"ContainerDied","Data":"c347f5b354bc93e1dc672cecd57d3e7c945fb4e440ef26530216c85b0b691c72"} Apr 22 20:44:56.008923 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:56.008889 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk" event={"ID":"3b133e2d-9b43-40a8-9710-670b7f38a9da","Type":"ContainerStarted","Data":"ecf1aa48012337d810f0f4da360ca6ab69411eb5b81172280bb17b5085d674fb"} Apr 22 20:44:56.009315 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:56.009167 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk" Apr 22 20:44:56.010356 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:56.010334 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk" podUID="3b133e2d-9b43-40a8-9710-670b7f38a9da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 22 20:44:56.025501 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:56.025452 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk" podStartSLOduration=7.025435537 podStartE2EDuration="7.025435537s" podCreationTimestamp="2026-04-22 20:44:49 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:44:56.024067488 +0000 UTC m=+2801.978430292" watchObservedRunningTime="2026-04-22 20:44:56.025435537 +0000 UTC m=+2801.979798343" Apr 22 20:44:57.011864 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:44:57.011832 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk" podUID="3b133e2d-9b43-40a8-9710-670b7f38a9da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 22 20:45:07.012813 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:07.012782 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk" Apr 22 20:45:19.759577 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:19.759546 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk"] Apr 22 20:45:19.759947 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:19.759829 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk" podUID="3b133e2d-9b43-40a8-9710-670b7f38a9da" containerName="kserve-container" containerID="cri-o://ecf1aa48012337d810f0f4da360ca6ab69411eb5b81172280bb17b5085d674fb" gracePeriod=30 Apr 22 20:45:19.806755 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:19.806723 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx"] Apr 22 20:45:19.812198 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:19.812178 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx" Apr 22 20:45:19.816427 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:19.816398 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx"] Apr 22 20:45:19.843315 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:19.843289 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a616020-c007-4602-97bf-6960e5b370a5-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-pvkhx\" (UID: \"4a616020-c007-4602-97bf-6960e5b370a5\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx" Apr 22 20:45:19.943841 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:19.943814 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a616020-c007-4602-97bf-6960e5b370a5-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-pvkhx\" (UID: \"4a616020-c007-4602-97bf-6960e5b370a5\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx" Apr 22 20:45:19.944236 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:19.944219 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a616020-c007-4602-97bf-6960e5b370a5-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-pvkhx\" (UID: \"4a616020-c007-4602-97bf-6960e5b370a5\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx" Apr 22 20:45:20.073570 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:20.073481 2579 generic.go:358] "Generic (PLEG): container finished" podID="2ec1bae6-9d01-4005-9a14-889e3d94ef4e" containerID="bdf627e705f465f26518dc43978a6467853656e051a8b5d43b908bea35262de7" exitCode=137 Apr 22 20:45:20.073716 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:20.073558 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r" event={"ID":"2ec1bae6-9d01-4005-9a14-889e3d94ef4e","Type":"ContainerDied","Data":"bdf627e705f465f26518dc43978a6467853656e051a8b5d43b908bea35262de7"} Apr 22 20:45:20.122271 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:20.122237 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx" Apr 22 20:45:20.240693 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:20.240669 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx"] Apr 22 20:45:20.243110 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:45:20.243082 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a616020_c007_4602_97bf_6960e5b370a5.slice/crio-1e5a34469b8535c879e648af83ebb99904d42c97ea2deb58ee4c052516d7c412 WatchSource:0}: Error finding container 1e5a34469b8535c879e648af83ebb99904d42c97ea2deb58ee4c052516d7c412: Status 404 returned error can't find the container with id 1e5a34469b8535c879e648af83ebb99904d42c97ea2deb58ee4c052516d7c412 Apr 22 20:45:20.491017 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:20.490995 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r" Apr 22 20:45:20.549211 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:20.549183 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ec1bae6-9d01-4005-9a14-889e3d94ef4e-kserve-provision-location\") pod \"2ec1bae6-9d01-4005-9a14-889e3d94ef4e\" (UID: \"2ec1bae6-9d01-4005-9a14-889e3d94ef4e\") " Apr 22 20:45:20.565998 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:20.565964 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ec1bae6-9d01-4005-9a14-889e3d94ef4e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2ec1bae6-9d01-4005-9a14-889e3d94ef4e" (UID: "2ec1bae6-9d01-4005-9a14-889e3d94ef4e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:45:20.649833 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:20.649746 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ec1bae6-9d01-4005-9a14-889e3d94ef4e-kserve-provision-location\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:45:21.077819 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:21.077715 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx" event={"ID":"4a616020-c007-4602-97bf-6960e5b370a5","Type":"ContainerStarted","Data":"96d7a0556c7f1ffd24fc1054d04286fd99e8613c7a18ad0fe6cff074f586524a"} Apr 22 20:45:21.077819 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:21.077765 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx" event={"ID":"4a616020-c007-4602-97bf-6960e5b370a5","Type":"ContainerStarted","Data":"1e5a34469b8535c879e648af83ebb99904d42c97ea2deb58ee4c052516d7c412"} Apr 22 20:45:21.079175 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:21.079152 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r" event={"ID":"2ec1bae6-9d01-4005-9a14-889e3d94ef4e","Type":"ContainerDied","Data":"294401de90658673452a8b8ac9f75fc690c2cbce5bfb8c9ffb4fd3d184333cf7"} Apr 22 20:45:21.079266 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:21.079190 2579 scope.go:117] "RemoveContainer" containerID="bdf627e705f465f26518dc43978a6467853656e051a8b5d43b908bea35262de7" Apr 22 20:45:21.079266 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:21.079160 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r" Apr 22 20:45:21.086599 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:21.086581 2579 scope.go:117] "RemoveContainer" containerID="ef9502fba91c1925e0a375e75eeecd0db7e681c53318f951db29cefad9640b84" Apr 22 20:45:21.106636 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:21.106613 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r"] Apr 22 20:45:21.109190 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:21.109171 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-jwz8r"] Apr 22 20:45:22.692836 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:22.692805 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ec1bae6-9d01-4005-9a14-889e3d94ef4e" path="/var/lib/kubelet/pods/2ec1bae6-9d01-4005-9a14-889e3d94ef4e/volumes" Apr 22 20:45:25.094422 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:25.094388 2579 generic.go:358] "Generic (PLEG): container finished" podID="4a616020-c007-4602-97bf-6960e5b370a5" containerID="96d7a0556c7f1ffd24fc1054d04286fd99e8613c7a18ad0fe6cff074f586524a" exitCode=0 Apr 22 20:45:25.094788 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:25.094434 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx" event={"ID":"4a616020-c007-4602-97bf-6960e5b370a5","Type":"ContainerDied","Data":"96d7a0556c7f1ffd24fc1054d04286fd99e8613c7a18ad0fe6cff074f586524a"} Apr 22 20:45:50.193308 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:50.193254 2579 generic.go:358] "Generic (PLEG): container finished" podID="3b133e2d-9b43-40a8-9710-670b7f38a9da" containerID="ecf1aa48012337d810f0f4da360ca6ab69411eb5b81172280bb17b5085d674fb" exitCode=137 Apr 22 20:45:50.193768 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:50.193345 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk" event={"ID":"3b133e2d-9b43-40a8-9710-670b7f38a9da","Type":"ContainerDied","Data":"ecf1aa48012337d810f0f4da360ca6ab69411eb5b81172280bb17b5085d674fb"} Apr 22 20:45:50.456344 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:50.456287 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk" Apr 22 20:45:50.601755 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:50.601720 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b133e2d-9b43-40a8-9710-670b7f38a9da-kserve-provision-location\") pod \"3b133e2d-9b43-40a8-9710-670b7f38a9da\" (UID: \"3b133e2d-9b43-40a8-9710-670b7f38a9da\") " Apr 22 20:45:50.620504 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:50.620465 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b133e2d-9b43-40a8-9710-670b7f38a9da-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3b133e2d-9b43-40a8-9710-670b7f38a9da" (UID: "3b133e2d-9b43-40a8-9710-670b7f38a9da"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:45:50.702321 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:50.702290 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b133e2d-9b43-40a8-9710-670b7f38a9da-kserve-provision-location\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:45:51.198501 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:51.198464 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk" event={"ID":"3b133e2d-9b43-40a8-9710-670b7f38a9da","Type":"ContainerDied","Data":"8de43c610368621071e44e1d6f847310dfbd10f534897d8f2056df64ac0c8cd3"} Apr 22 20:45:51.198955 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:51.198517 2579 scope.go:117] "RemoveContainer" containerID="ecf1aa48012337d810f0f4da360ca6ab69411eb5b81172280bb17b5085d674fb" Apr 22 20:45:51.198955 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:51.198650 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk" Apr 22 20:45:51.208742 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:51.208634 2579 scope.go:117] "RemoveContainer" containerID="c347f5b354bc93e1dc672cecd57d3e7c945fb4e440ef26530216c85b0b691c72" Apr 22 20:45:51.215043 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:51.214999 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk"] Apr 22 20:45:51.216222 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:51.216193 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-7vzjk"] Apr 22 20:45:52.693665 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:45:52.693628 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b133e2d-9b43-40a8-9710-670b7f38a9da" path="/var/lib/kubelet/pods/3b133e2d-9b43-40a8-9710-670b7f38a9da/volumes" Apr 22 20:47:21.457808 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:21.457768 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx" event={"ID":"4a616020-c007-4602-97bf-6960e5b370a5","Type":"ContainerStarted","Data":"4d410a4cd068b1d2383a03bd03465c92a48c230111665ece3014f3bc479ea85d"} Apr 22 20:47:21.458246 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:21.457954 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx" Apr 22 20:47:21.459237 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:21.459209 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx" podUID="4a616020-c007-4602-97bf-6960e5b370a5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 22 20:47:21.479293 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:21.479253 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx" podStartSLOduration=7.023967515 podStartE2EDuration="2m2.479240815s" podCreationTimestamp="2026-04-22 20:45:19 +0000 UTC" firstStartedPulling="2026-04-22 20:45:25.095476432 +0000 UTC m=+2831.049839210" lastFinishedPulling="2026-04-22 20:47:20.550749731 +0000 UTC m=+2946.505112510" observedRunningTime="2026-04-22 
Apr 22 20:47:21.479293 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:21.479253 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx" podStartSLOduration=7.023967515 podStartE2EDuration="2m2.479240815s" podCreationTimestamp="2026-04-22 20:45:19 +0000 UTC" firstStartedPulling="2026-04-22 20:45:25.095476432 +0000 UTC m=+2831.049839210" lastFinishedPulling="2026-04-22 20:47:20.550749731 +0000 UTC m=+2946.505112510" observedRunningTime="2026-04-22 20:47:21.478243412 +0000 UTC m=+2947.432606214" watchObservedRunningTime="2026-04-22 20:47:21.479240815 +0000 UTC m=+2947.433603616"
Apr 22 20:47:22.461339 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:22.461303 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx" podUID="4a616020-c007-4602-97bf-6960e5b370a5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 22 20:47:32.462533 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:32.462497 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx"
Apr 22 20:47:42.094665 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:42.094629 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx"]
Apr 22 20:47:42.095306 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:42.094897 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx" podUID="4a616020-c007-4602-97bf-6960e5b370a5" containerName="kserve-container" containerID="cri-o://4d410a4cd068b1d2383a03bd03465c92a48c230111665ece3014f3bc479ea85d" gracePeriod=30
Apr 22 20:47:42.180541 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:42.180508 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758"]
Apr 22 20:47:42.180812 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:42.180778 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ec1bae6-9d01-4005-9a14-889e3d94ef4e" containerName="storage-initializer"
Apr 22 20:47:42.180812 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:42.180792 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ec1bae6-9d01-4005-9a14-889e3d94ef4e" containerName="storage-initializer"
Apr 22 20:47:42.180812 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:42.180803 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ec1bae6-9d01-4005-9a14-889e3d94ef4e" containerName="kserve-container"
Apr 22 20:47:42.180812 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:42.180808 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ec1bae6-9d01-4005-9a14-889e3d94ef4e" containerName="kserve-container"
Apr 22 20:47:42.180944 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:42.180816 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b133e2d-9b43-40a8-9710-670b7f38a9da" containerName="storage-initializer"
Apr 22 20:47:42.180944 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:42.180821 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b133e2d-9b43-40a8-9710-670b7f38a9da" containerName="storage-initializer"
Apr 22 20:47:42.180944 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:42.180834 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b133e2d-9b43-40a8-9710-670b7f38a9da" containerName="kserve-container"
Apr 22 20:47:42.180944 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:42.180839 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b133e2d-9b43-40a8-9710-670b7f38a9da" containerName="kserve-container"
Apr 22 20:47:42.180944 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:42.180877 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="3b133e2d-9b43-40a8-9710-670b7f38a9da" containerName="kserve-container"
Apr 22 20:47:42.180944 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:42.180885 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ec1bae6-9d01-4005-9a14-889e3d94ef4e" containerName="kserve-container"
Apr 22 20:47:42.212821 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:42.212797 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758"]
Apr 22 20:47:42.212956 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:42.212903 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758"
Apr 22 20:47:42.268674 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:42.268646 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-8f758\" (UID: \"2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758"
Apr 22 20:47:42.369679 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:42.369658 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-8f758\" (UID: \"2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758"
Apr 22 20:47:42.369989 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:42.369974 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-8f758\" (UID: \"2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758"
Apr 22 20:47:42.522781 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:42.522754 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758"
Apr 22 20:47:42.637010 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:42.636949 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758"]
Apr 22 20:47:42.640412 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:47:42.640381 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dc7f0b7_cd01_48b3_9707_ee3a3b5ff192.slice/crio-7bc263b2ccb363d8d84c0a745373250c670730dcb96a09b885b8229eb466cebd WatchSource:0}: Error finding container 7bc263b2ccb363d8d84c0a745373250c670730dcb96a09b885b8229eb466cebd: Status 404 returned error can't find the container with id 7bc263b2ccb363d8d84c0a745373250c670730dcb96a09b885b8229eb466cebd
Apr 22 20:47:43.516695 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:43.516664 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758" event={"ID":"2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192","Type":"ContainerStarted","Data":"6c259eb61d27d1f1e57f7b76c07cf677d01349f7bd3d8f16ffbe7e305e3dabfe"}
Apr 22 20:47:43.516695 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:43.516697 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758" event={"ID":"2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192","Type":"ContainerStarted","Data":"7bc263b2ccb363d8d84c0a745373250c670730dcb96a09b885b8229eb466cebd"}
Apr 22 20:47:44.352444 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:44.352423 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx"
Apr 22 20:47:44.486344 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:44.486268 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a616020-c007-4602-97bf-6960e5b370a5-kserve-provision-location\") pod \"4a616020-c007-4602-97bf-6960e5b370a5\" (UID: \"4a616020-c007-4602-97bf-6960e5b370a5\") "
Apr 22 20:47:44.486672 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:44.486647 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a616020-c007-4602-97bf-6960e5b370a5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4a616020-c007-4602-97bf-6960e5b370a5" (UID: "4a616020-c007-4602-97bf-6960e5b370a5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:47:44.520462 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:44.520434 2579 generic.go:358] "Generic (PLEG): container finished" podID="4a616020-c007-4602-97bf-6960e5b370a5" containerID="4d410a4cd068b1d2383a03bd03465c92a48c230111665ece3014f3bc479ea85d" exitCode=0
Apr 22 20:47:44.520816 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:44.520507 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx"
Apr 22 20:47:44.520816 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:44.520521 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx" event={"ID":"4a616020-c007-4602-97bf-6960e5b370a5","Type":"ContainerDied","Data":"4d410a4cd068b1d2383a03bd03465c92a48c230111665ece3014f3bc479ea85d"}
Apr 22 20:47:44.520816 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:44.520561 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx" event={"ID":"4a616020-c007-4602-97bf-6960e5b370a5","Type":"ContainerDied","Data":"1e5a34469b8535c879e648af83ebb99904d42c97ea2deb58ee4c052516d7c412"}
Apr 22 20:47:44.520816 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:44.520581 2579 scope.go:117] "RemoveContainer" containerID="4d410a4cd068b1d2383a03bd03465c92a48c230111665ece3014f3bc479ea85d"
Apr 22 20:47:44.528050 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:44.528033 2579 scope.go:117] "RemoveContainer" containerID="96d7a0556c7f1ffd24fc1054d04286fd99e8613c7a18ad0fe6cff074f586524a"
Apr 22 20:47:44.534987 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:44.534967 2579 scope.go:117] "RemoveContainer" containerID="4d410a4cd068b1d2383a03bd03465c92a48c230111665ece3014f3bc479ea85d"
Apr 22 20:47:44.535269 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:47:44.535252 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d410a4cd068b1d2383a03bd03465c92a48c230111665ece3014f3bc479ea85d\": container with ID starting with 4d410a4cd068b1d2383a03bd03465c92a48c230111665ece3014f3bc479ea85d not found: ID does not exist" containerID="4d410a4cd068b1d2383a03bd03465c92a48c230111665ece3014f3bc479ea85d"
Apr 22 20:47:44.535342 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:44.535277 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d410a4cd068b1d2383a03bd03465c92a48c230111665ece3014f3bc479ea85d"} err="failed to get container status \"4d410a4cd068b1d2383a03bd03465c92a48c230111665ece3014f3bc479ea85d\": rpc error: code = NotFound desc = could not find container \"4d410a4cd068b1d2383a03bd03465c92a48c230111665ece3014f3bc479ea85d\": container with ID starting with 4d410a4cd068b1d2383a03bd03465c92a48c230111665ece3014f3bc479ea85d not found: ID does not exist"
Apr 22 20:47:44.535342 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:44.535293 2579 scope.go:117] "RemoveContainer" containerID="96d7a0556c7f1ffd24fc1054d04286fd99e8613c7a18ad0fe6cff074f586524a"
Apr 22 20:47:44.535536 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:47:44.535519 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96d7a0556c7f1ffd24fc1054d04286fd99e8613c7a18ad0fe6cff074f586524a\": container with ID starting with 96d7a0556c7f1ffd24fc1054d04286fd99e8613c7a18ad0fe6cff074f586524a not found: ID does not exist" containerID="96d7a0556c7f1ffd24fc1054d04286fd99e8613c7a18ad0fe6cff074f586524a"
find container \"96d7a0556c7f1ffd24fc1054d04286fd99e8613c7a18ad0fe6cff074f586524a\": container with ID starting with 96d7a0556c7f1ffd24fc1054d04286fd99e8613c7a18ad0fe6cff074f586524a not found: ID does not exist" Apr 22 20:47:44.541403 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:44.541373 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx"] Apr 22 20:47:44.543019 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:44.542998 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-pvkhx"] Apr 22 20:47:44.587275 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:44.587253 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a616020-c007-4602-97bf-6960e5b370a5-kserve-provision-location\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:47:44.691973 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:44.691949 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a616020-c007-4602-97bf-6960e5b370a5" path="/var/lib/kubelet/pods/4a616020-c007-4602-97bf-6960e5b370a5/volumes" Apr 22 20:47:46.527754 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:46.527714 2579 generic.go:358] "Generic (PLEG): container finished" podID="2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192" containerID="6c259eb61d27d1f1e57f7b76c07cf677d01349f7bd3d8f16ffbe7e305e3dabfe" exitCode=0 Apr 22 20:47:46.528061 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:47:46.527769 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758" event={"ID":"2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192","Type":"ContainerDied","Data":"6c259eb61d27d1f1e57f7b76c07cf677d01349f7bd3d8f16ffbe7e305e3dabfe"} Apr 22 20:48:06.592210 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:48:06.592176 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758" event={"ID":"2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192","Type":"ContainerStarted","Data":"23fe178811b687d2fbcf92b53ad118aa3be77cab422ebe0661d574bfcc01908c"} Apr 22 20:48:06.592598 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:48:06.592550 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758" Apr 22 20:48:06.593657 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:48:06.593632 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758" podUID="2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 22 20:48:06.606219 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:48:06.606181 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758" podStartSLOduration=5.310614647 podStartE2EDuration="24.606166858s" podCreationTimestamp="2026-04-22 20:47:42 +0000 UTC" firstStartedPulling="2026-04-22 20:47:46.528900265 +0000 UTC m=+2972.483263044" lastFinishedPulling="2026-04-22 20:48:05.824452476 +0000 UTC m=+2991.778815255" observedRunningTime="2026-04-22 20:48:06.605782526 +0000 UTC m=+2992.560145326" watchObservedRunningTime="2026-04-22 20:48:06.606166858 +0000 UTC m=+2992.560529665" Apr 22 20:48:07.595569 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:48:07.595524 2579 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758" podUID="2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 22 20:48:17.596357 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:48:17.596316 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758" podUID="2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 22 20:48:27.596219 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:48:27.596176 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758" podUID="2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 22 20:48:37.596532 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:48:37.596488 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758" podUID="2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 22 20:48:47.595805 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:48:47.595763 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758" podUID="2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 22 20:48:57.595763 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:48:57.595720 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758" podUID="2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 22 20:49:07.596496 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:49:07.596422 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758" Apr 22 20:49:12.275432 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:49:12.275398 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758"] Apr 22 20:49:12.275931 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:49:12.275658 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758" podUID="2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192" containerName="kserve-container" containerID="cri-o://23fe178811b687d2fbcf92b53ad118aa3be77cab422ebe0661d574bfcc01908c" gracePeriod=30 Apr 22 20:49:15.505764 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:49:15.505738 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758" Apr 22 20:49:15.534466 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:49:15.534438 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192-kserve-provision-location\") pod \"2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192\" (UID: \"2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192\") " Apr 22 20:49:15.534708 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:49:15.534685 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192" (UID: "2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:49:15.635575 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:49:15.635550 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192-kserve-provision-location\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 22 20:49:15.767929 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:49:15.767897 2579 generic.go:358] "Generic (PLEG): container finished" podID="2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192" containerID="23fe178811b687d2fbcf92b53ad118aa3be77cab422ebe0661d574bfcc01908c" exitCode=0 Apr 22 20:49:15.768055 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:49:15.767935 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758" event={"ID":"2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192","Type":"ContainerDied","Data":"23fe178811b687d2fbcf92b53ad118aa3be77cab422ebe0661d574bfcc01908c"} Apr 22 20:49:15.768055 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:49:15.767956 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758" event={"ID":"2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192","Type":"ContainerDied","Data":"7bc263b2ccb363d8d84c0a745373250c670730dcb96a09b885b8229eb466cebd"} Apr 22 20:49:15.768055 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:49:15.767960 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758" Apr 22 20:49:15.768055 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:49:15.767970 2579 scope.go:117] "RemoveContainer" containerID="23fe178811b687d2fbcf92b53ad118aa3be77cab422ebe0661d574bfcc01908c" Apr 22 20:49:15.775729 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:49:15.775624 2579 scope.go:117] "RemoveContainer" containerID="6c259eb61d27d1f1e57f7b76c07cf677d01349f7bd3d8f16ffbe7e305e3dabfe" Apr 22 20:49:15.782659 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:49:15.782640 2579 scope.go:117] "RemoveContainer" containerID="23fe178811b687d2fbcf92b53ad118aa3be77cab422ebe0661d574bfcc01908c" Apr 22 20:49:15.782889 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:49:15.782870 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23fe178811b687d2fbcf92b53ad118aa3be77cab422ebe0661d574bfcc01908c\": container with ID starting with 23fe178811b687d2fbcf92b53ad118aa3be77cab422ebe0661d574bfcc01908c not found: ID does not exist" containerID="23fe178811b687d2fbcf92b53ad118aa3be77cab422ebe0661d574bfcc01908c" Apr 22 20:49:15.782933 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:49:15.782898 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23fe178811b687d2fbcf92b53ad118aa3be77cab422ebe0661d574bfcc01908c"} err="failed to get container status \"23fe178811b687d2fbcf92b53ad118aa3be77cab422ebe0661d574bfcc01908c\": rpc error: code = NotFound desc = could not find container \"23fe178811b687d2fbcf92b53ad118aa3be77cab422ebe0661d574bfcc01908c\": container with ID starting with 23fe178811b687d2fbcf92b53ad118aa3be77cab422ebe0661d574bfcc01908c not found: ID does not exist" Apr 22 20:49:15.782933 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:49:15.782916 2579 scope.go:117] "RemoveContainer" containerID="6c259eb61d27d1f1e57f7b76c07cf677d01349f7bd3d8f16ffbe7e305e3dabfe" Apr 22 20:49:15.783189 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:49:15.783171 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c259eb61d27d1f1e57f7b76c07cf677d01349f7bd3d8f16ffbe7e305e3dabfe\": container with ID starting with 6c259eb61d27d1f1e57f7b76c07cf677d01349f7bd3d8f16ffbe7e305e3dabfe not found: ID does not exist" containerID="6c259eb61d27d1f1e57f7b76c07cf677d01349f7bd3d8f16ffbe7e305e3dabfe" Apr 22 20:49:15.783235 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:49:15.783196 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c259eb61d27d1f1e57f7b76c07cf677d01349f7bd3d8f16ffbe7e305e3dabfe"} err="failed to get container status \"6c259eb61d27d1f1e57f7b76c07cf677d01349f7bd3d8f16ffbe7e305e3dabfe\": rpc error: code = NotFound desc = could not find container \"6c259eb61d27d1f1e57f7b76c07cf677d01349f7bd3d8f16ffbe7e305e3dabfe\": container with ID starting with 6c259eb61d27d1f1e57f7b76c07cf677d01349f7bd3d8f16ffbe7e305e3dabfe not found: ID does not exist" Apr 22 20:49:15.786925 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:49:15.786901 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758"] Apr 22 20:49:15.790578 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:49:15.790559 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-8f758"] Apr 22 20:49:16.692938 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:49:16.692909 
2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192" path="/var/lib/kubelet/pods/2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192/volumes" Apr 22 20:50:32.776678 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:32.776574 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8"] Apr 22 20:50:32.777089 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:32.776812 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a616020-c007-4602-97bf-6960e5b370a5" containerName="storage-initializer" Apr 22 20:50:32.777089 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:32.776822 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a616020-c007-4602-97bf-6960e5b370a5" containerName="storage-initializer" Apr 22 20:50:32.777089 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:32.776830 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192" containerName="kserve-container" Apr 22 20:50:32.777089 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:32.776835 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192" containerName="kserve-container" Apr 22 20:50:32.777089 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:32.776844 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a616020-c007-4602-97bf-6960e5b370a5" containerName="kserve-container" Apr 22 20:50:32.777089 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:32.776850 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a616020-c007-4602-97bf-6960e5b370a5" containerName="kserve-container" Apr 22 20:50:32.777089 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:32.776856 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192" containerName="storage-initializer" Apr 22 20:50:32.777089 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:32.776861 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192" containerName="storage-initializer" Apr 22 20:50:32.777089 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:32.776897 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a616020-c007-4602-97bf-6960e5b370a5" containerName="kserve-container" Apr 22 20:50:32.777089 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:32.776904 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2dc7f0b7-cd01-48b3-9707-ee3a3b5ff192" containerName="kserve-container" Apr 22 20:50:32.779862 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:32.779846 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" Apr 22 20:50:32.781824 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:32.781803 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t9c5k\"" Apr 22 20:50:32.786676 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:32.786651 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8"] Apr 22 20:50:32.867275 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:32.867251 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17b9e6f6-e700-4913-9bc9-58f3693cf7c9-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-n2dk8\" (UID: \"17b9e6f6-e700-4913-9bc9-58f3693cf7c9\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" Apr 22 20:50:32.967970 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:32.967945 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17b9e6f6-e700-4913-9bc9-58f3693cf7c9-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-n2dk8\" (UID: \"17b9e6f6-e700-4913-9bc9-58f3693cf7c9\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" Apr 22 20:50:32.968262 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:32.968247 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17b9e6f6-e700-4913-9bc9-58f3693cf7c9-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-n2dk8\" (UID: \"17b9e6f6-e700-4913-9bc9-58f3693cf7c9\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" Apr 22 20:50:33.091019 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:33.090965 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" Apr 22 20:50:33.203803 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:33.203767 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8"] Apr 22 20:50:33.206876 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:50:33.206849 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17b9e6f6_e700_4913_9bc9_58f3693cf7c9.slice/crio-d507d1d458b208aa97f05b77212dcd0318eb74915e2593df6f52908672296c35 WatchSource:0}: Error finding container d507d1d458b208aa97f05b77212dcd0318eb74915e2593df6f52908672296c35: Status 404 returned error can't find the container with id d507d1d458b208aa97f05b77212dcd0318eb74915e2593df6f52908672296c35 Apr 22 20:50:33.208710 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:33.208690 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:50:33.969117 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:33.969082 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" event={"ID":"17b9e6f6-e700-4913-9bc9-58f3693cf7c9","Type":"ContainerStarted","Data":"fa5151c66d6386beb68e5a4fc5d2e58d66e5aff9ffd2993beb91c6e5e0121b8c"} Apr 22 20:50:33.969117 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:33.969117 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" event={"ID":"17b9e6f6-e700-4913-9bc9-58f3693cf7c9","Type":"ContainerStarted","Data":"d507d1d458b208aa97f05b77212dcd0318eb74915e2593df6f52908672296c35"} Apr 22 20:50:36.977911 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:36.977883 2579 generic.go:358] "Generic (PLEG): container finished" podID="17b9e6f6-e700-4913-9bc9-58f3693cf7c9" containerID="fa5151c66d6386beb68e5a4fc5d2e58d66e5aff9ffd2993beb91c6e5e0121b8c" exitCode=0 Apr 22 20:50:36.978294 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:36.977962 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" event={"ID":"17b9e6f6-e700-4913-9bc9-58f3693cf7c9","Type":"ContainerDied","Data":"fa5151c66d6386beb68e5a4fc5d2e58d66e5aff9ffd2993beb91c6e5e0121b8c"} Apr 22 20:50:37.982304 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:37.982271 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" event={"ID":"17b9e6f6-e700-4913-9bc9-58f3693cf7c9","Type":"ContainerStarted","Data":"906f690ab25a518d8efbc9bcba360e7a314c76fdf0ff2c75139bf5f8ab62f88b"} Apr 22 20:50:37.982728 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:37.982644 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" Apr 22 20:50:37.983619 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:37.983597 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" podUID="17b9e6f6-e700-4913-9bc9-58f3693cf7c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 22 20:50:37.997202 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:37.997130 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" podStartSLOduration=5.997114441 podStartE2EDuration="5.997114441s" podCreationTimestamp="2026-04-22 20:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:50:37.996480197 +0000 UTC m=+3143.950843009" watchObservedRunningTime="2026-04-22 20:50:37.997114441 +0000 UTC m=+3143.951477243" Apr 22 20:50:38.985539 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:38.985499 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" podUID="17b9e6f6-e700-4913-9bc9-58f3693cf7c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 22 20:50:48.985505 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:48.985452 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" podUID="17b9e6f6-e700-4913-9bc9-58f3693cf7c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 22 20:50:58.985836 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:58.985792 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" podUID="17b9e6f6-e700-4913-9bc9-58f3693cf7c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 22 20:51:08.985770 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:08.985726 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" podUID="17b9e6f6-e700-4913-9bc9-58f3693cf7c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 22 20:51:18.985514 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:18.985477 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" podUID="17b9e6f6-e700-4913-9bc9-58f3693cf7c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 22 20:51:28.985759 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:28.985721 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" podUID="17b9e6f6-e700-4913-9bc9-58f3693cf7c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 22 20:51:38.986855 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:38.986827 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" Apr 22 20:51:42.915436 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:42.915405 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8"] Apr 22 20:51:42.915869 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:42.915734 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" podUID="17b9e6f6-e700-4913-9bc9-58f3693cf7c9" containerName="kserve-container" containerID="cri-o://906f690ab25a518d8efbc9bcba360e7a314c76fdf0ff2c75139bf5f8ab62f88b" 
Apr 22 20:51:46.056739 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:46.056713 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8"
Apr 22 20:51:46.165388 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:46.165355 2579 generic.go:358] "Generic (PLEG): container finished" podID="17b9e6f6-e700-4913-9bc9-58f3693cf7c9" containerID="906f690ab25a518d8efbc9bcba360e7a314c76fdf0ff2c75139bf5f8ab62f88b" exitCode=0
Apr 22 20:51:46.165519 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:46.165413 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" event={"ID":"17b9e6f6-e700-4913-9bc9-58f3693cf7c9","Type":"ContainerDied","Data":"906f690ab25a518d8efbc9bcba360e7a314c76fdf0ff2c75139bf5f8ab62f88b"}
Apr 22 20:51:46.165519 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:46.165426 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8"
Apr 22 20:51:46.165519 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:46.165449 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8" event={"ID":"17b9e6f6-e700-4913-9bc9-58f3693cf7c9","Type":"ContainerDied","Data":"d507d1d458b208aa97f05b77212dcd0318eb74915e2593df6f52908672296c35"}
Apr 22 20:51:46.165519 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:46.165467 2579 scope.go:117] "RemoveContainer" containerID="906f690ab25a518d8efbc9bcba360e7a314c76fdf0ff2c75139bf5f8ab62f88b"
Apr 22 20:51:46.172705 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:46.172686 2579 scope.go:117] "RemoveContainer" containerID="fa5151c66d6386beb68e5a4fc5d2e58d66e5aff9ffd2993beb91c6e5e0121b8c"
Apr 22 20:51:46.176072 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:46.176054 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17b9e6f6-e700-4913-9bc9-58f3693cf7c9-kserve-provision-location\") pod \"17b9e6f6-e700-4913-9bc9-58f3693cf7c9\" (UID: \"17b9e6f6-e700-4913-9bc9-58f3693cf7c9\") "
Apr 22 20:51:46.176470 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:46.176446 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17b9e6f6-e700-4913-9bc9-58f3693cf7c9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "17b9e6f6-e700-4913-9bc9-58f3693cf7c9" (UID: "17b9e6f6-e700-4913-9bc9-58f3693cf7c9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:51:46.179246 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:46.179229 2579 scope.go:117] "RemoveContainer" containerID="906f690ab25a518d8efbc9bcba360e7a314c76fdf0ff2c75139bf5f8ab62f88b"
Apr 22 20:51:46.179473 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:51:46.179457 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"906f690ab25a518d8efbc9bcba360e7a314c76fdf0ff2c75139bf5f8ab62f88b\": container with ID starting with 906f690ab25a518d8efbc9bcba360e7a314c76fdf0ff2c75139bf5f8ab62f88b not found: ID does not exist" containerID="906f690ab25a518d8efbc9bcba360e7a314c76fdf0ff2c75139bf5f8ab62f88b"
Apr 22 20:51:46.179517 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:46.179480 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906f690ab25a518d8efbc9bcba360e7a314c76fdf0ff2c75139bf5f8ab62f88b"} err="failed to get container status \"906f690ab25a518d8efbc9bcba360e7a314c76fdf0ff2c75139bf5f8ab62f88b\": rpc error: code = NotFound desc = could not find container \"906f690ab25a518d8efbc9bcba360e7a314c76fdf0ff2c75139bf5f8ab62f88b\": container with ID starting with 906f690ab25a518d8efbc9bcba360e7a314c76fdf0ff2c75139bf5f8ab62f88b not found: ID does not exist"
Apr 22 20:51:46.179517 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:46.179495 2579 scope.go:117] "RemoveContainer" containerID="fa5151c66d6386beb68e5a4fc5d2e58d66e5aff9ffd2993beb91c6e5e0121b8c"
Apr 22 20:51:46.179711 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:51:46.179697 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa5151c66d6386beb68e5a4fc5d2e58d66e5aff9ffd2993beb91c6e5e0121b8c\": container with ID starting with fa5151c66d6386beb68e5a4fc5d2e58d66e5aff9ffd2993beb91c6e5e0121b8c not found: ID does not exist" containerID="fa5151c66d6386beb68e5a4fc5d2e58d66e5aff9ffd2993beb91c6e5e0121b8c"
Apr 22 20:51:46.179756 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:46.179714 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa5151c66d6386beb68e5a4fc5d2e58d66e5aff9ffd2993beb91c6e5e0121b8c"} err="failed to get container status \"fa5151c66d6386beb68e5a4fc5d2e58d66e5aff9ffd2993beb91c6e5e0121b8c\": rpc error: code = NotFound desc = could not find container \"fa5151c66d6386beb68e5a4fc5d2e58d66e5aff9ffd2993beb91c6e5e0121b8c\": container with ID starting with fa5151c66d6386beb68e5a4fc5d2e58d66e5aff9ffd2993beb91c6e5e0121b8c not found: ID does not exist"
Apr 22 20:51:46.276741 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:46.276706 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17b9e6f6-e700-4913-9bc9-58f3693cf7c9-kserve-provision-location\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\""
Apr 22 20:51:46.484742 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:46.484713 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8"]
Apr 22 20:51:46.487604 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:46.487585 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-n2dk8"]
Apr 22 20:51:46.692338 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:51:46.692296 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17b9e6f6-e700-4913-9bc9-58f3693cf7c9" path="/var/lib/kubelet/pods/17b9e6f6-e700-4913-9bc9-58f3693cf7c9/volumes"
podUID="17b9e6f6-e700-4913-9bc9-58f3693cf7c9" path="/var/lib/kubelet/pods/17b9e6f6-e700-4913-9bc9-58f3693cf7c9/volumes" Apr 22 20:52:33.101046 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:33.100965 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj"] Apr 22 20:52:33.101532 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:33.101257 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17b9e6f6-e700-4913-9bc9-58f3693cf7c9" containerName="storage-initializer" Apr 22 20:52:33.101532 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:33.101270 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b9e6f6-e700-4913-9bc9-58f3693cf7c9" containerName="storage-initializer" Apr 22 20:52:33.101532 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:33.101280 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17b9e6f6-e700-4913-9bc9-58f3693cf7c9" containerName="kserve-container" Apr 22 20:52:33.101532 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:33.101285 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b9e6f6-e700-4913-9bc9-58f3693cf7c9" containerName="kserve-container" Apr 22 20:52:33.101532 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:33.101329 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="17b9e6f6-e700-4913-9bc9-58f3693cf7c9" containerName="kserve-container" Apr 22 20:52:33.104197 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:33.104181 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" Apr 22 20:52:33.106274 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:33.106253 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t9c5k\"" Apr 22 20:52:33.113118 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:33.113095 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj"] Apr 22 20:52:33.194902 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:33.194871 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/400d87cb-6f81-444b-a147-9ccf6c6ba26b-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-bfqwj\" (UID: \"400d87cb-6f81-444b-a147-9ccf6c6ba26b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" Apr 22 20:52:33.295763 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:33.295732 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/400d87cb-6f81-444b-a147-9ccf6c6ba26b-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-bfqwj\" (UID: \"400d87cb-6f81-444b-a147-9ccf6c6ba26b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" Apr 22 20:52:33.296067 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:33.296052 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/400d87cb-6f81-444b-a147-9ccf6c6ba26b-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-bfqwj\" (UID: \"400d87cb-6f81-444b-a147-9ccf6c6ba26b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" Apr 22 20:52:33.415062 ip-10-0-141-46 
kubenswrapper[2579]: I0422 20:52:33.415033 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" Apr 22 20:52:33.527770 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:33.527741 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj"] Apr 22 20:52:33.530886 ip-10-0-141-46 kubenswrapper[2579]: W0422 20:52:33.530845 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod400d87cb_6f81_444b_a147_9ccf6c6ba26b.slice/crio-80d145a8c01834aa2127d3f1f2b9a8f12056413ccec083f143a101e5636b7d80 WatchSource:0}: Error finding container 80d145a8c01834aa2127d3f1f2b9a8f12056413ccec083f143a101e5636b7d80: Status 404 returned error can't find the container with id 80d145a8c01834aa2127d3f1f2b9a8f12056413ccec083f143a101e5636b7d80 Apr 22 20:52:34.294632 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:34.294594 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" event={"ID":"400d87cb-6f81-444b-a147-9ccf6c6ba26b","Type":"ContainerStarted","Data":"bb622b775d592f331bbf861f28f2e008ee181392c5c3f43075bcaefdf88087c5"} Apr 22 20:52:34.294632 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:34.294633 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" event={"ID":"400d87cb-6f81-444b-a147-9ccf6c6ba26b","Type":"ContainerStarted","Data":"80d145a8c01834aa2127d3f1f2b9a8f12056413ccec083f143a101e5636b7d80"} Apr 22 20:52:38.311228 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:38.311194 2579 generic.go:358] "Generic (PLEG): container finished" podID="400d87cb-6f81-444b-a147-9ccf6c6ba26b" containerID="bb622b775d592f331bbf861f28f2e008ee181392c5c3f43075bcaefdf88087c5" exitCode=0 Apr 22 20:52:38.311603 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:38.311268 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" event={"ID":"400d87cb-6f81-444b-a147-9ccf6c6ba26b","Type":"ContainerDied","Data":"bb622b775d592f331bbf861f28f2e008ee181392c5c3f43075bcaefdf88087c5"} Apr 22 20:52:39.315007 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:39.314973 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" event={"ID":"400d87cb-6f81-444b-a147-9ccf6c6ba26b","Type":"ContainerStarted","Data":"5433ba9aea163265afcca029f05c48d201082137e9d72726da7caf60824ef4b4"} Apr 22 20:52:39.315386 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:39.315263 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" Apr 22 20:52:39.316640 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:39.316615 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" podUID="400d87cb-6f81-444b-a147-9ccf6c6ba26b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 22 20:52:39.330298 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:39.330257 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" podStartSLOduration=6.3302456 podStartE2EDuration="6.3302456s" 
podCreationTimestamp="2026-04-22 20:52:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:52:39.329263813 +0000 UTC m=+3265.283626614" watchObservedRunningTime="2026-04-22 20:52:39.3302456 +0000 UTC m=+3265.284608400" Apr 22 20:52:40.317510 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:40.317474 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" podUID="400d87cb-6f81-444b-a147-9ccf6c6ba26b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 22 20:52:50.317918 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:52:50.317876 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" podUID="400d87cb-6f81-444b-a147-9ccf6c6ba26b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 22 20:53:00.318216 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:00.318174 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" podUID="400d87cb-6f81-444b-a147-9ccf6c6ba26b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 22 20:53:10.317739 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:10.317691 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" podUID="400d87cb-6f81-444b-a147-9ccf6c6ba26b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 22 20:53:20.317619 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:20.317576 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" podUID="400d87cb-6f81-444b-a147-9ccf6c6ba26b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 22 20:53:30.318251 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:30.318200 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" podUID="400d87cb-6f81-444b-a147-9ccf6c6ba26b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 22 20:53:40.319373 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:40.319299 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" Apr 22 20:53:43.198126 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:43.198091 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj"] Apr 22 20:53:43.198580 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:43.198384 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" podUID="400d87cb-6f81-444b-a147-9ccf6c6ba26b" containerName="kserve-container" containerID="cri-o://5433ba9aea163265afcca029f05c48d201082137e9d72726da7caf60824ef4b4" gracePeriod=30 Apr 22 20:53:46.231841 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:46.231813 2579 util.go:48] "No ready sandbox for pod can be found. 
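"Killing container with a grace period" with gracePeriod=30, as in the record above, means the runtime delivers SIGTERM first and escalates to SIGKILL if the container is still running after 30 seconds. A sketch of a server that cooperates by draining within that window (port and drain timeout are illustrative, not taken from this workload):

```go
// Sketch of SIGTERM-aware shutdown: stop accepting connections and
// drain within a deadline shorter than the 30s grace period, instead
// of waiting to be SIGKILLed at the end of it.
package main

import (
	"context"
	"log"
	"net/http"
	"os/signal"
	"syscall"
	"time"
)

func main() {
	ctx, stop := signal.NotifyContext(context.Background(), syscall.SIGTERM)
	defer stop()

	srv := &http.Server{Addr: ":8080"}
	go func() {
		if err := srv.ListenAndServe(); err != http.ErrServerClosed {
			log.Fatal(err)
		}
	}()

	<-ctx.Done() // SIGTERM delivered by the container runtime

	shutdownCtx, cancel := context.WithTimeout(context.Background(), 25*time.Second)
	defer cancel()
	if err := srv.Shutdown(shutdownCtx); err != nil {
		log.Printf("forced shutdown: %v", err)
	}
}
```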
Apr 22 20:53:46.231841 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:46.231813 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj"
Apr 22 20:53:46.296826 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:46.296762 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/400d87cb-6f81-444b-a147-9ccf6c6ba26b-kserve-provision-location\") pod \"400d87cb-6f81-444b-a147-9ccf6c6ba26b\" (UID: \"400d87cb-6f81-444b-a147-9ccf6c6ba26b\") "
Apr 22 20:53:46.297071 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:46.297047 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/400d87cb-6f81-444b-a147-9ccf6c6ba26b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "400d87cb-6f81-444b-a147-9ccf6c6ba26b" (UID: "400d87cb-6f81-444b-a147-9ccf6c6ba26b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:53:46.397968 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:46.397939 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/400d87cb-6f81-444b-a147-9ccf6c6ba26b-kserve-provision-location\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\""
Apr 22 20:53:46.485443 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:46.485417 2579 generic.go:358] "Generic (PLEG): container finished" podID="400d87cb-6f81-444b-a147-9ccf6c6ba26b" containerID="5433ba9aea163265afcca029f05c48d201082137e9d72726da7caf60824ef4b4" exitCode=0
Apr 22 20:53:46.485541 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:46.485456 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" event={"ID":"400d87cb-6f81-444b-a147-9ccf6c6ba26b","Type":"ContainerDied","Data":"5433ba9aea163265afcca029f05c48d201082137e9d72726da7caf60824ef4b4"}
Apr 22 20:53:46.485541 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:46.485478 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj" event={"ID":"400d87cb-6f81-444b-a147-9ccf6c6ba26b","Type":"ContainerDied","Data":"80d145a8c01834aa2127d3f1f2b9a8f12056413ccec083f143a101e5636b7d80"}
Apr 22 20:53:46.485541 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:46.485484 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj"
Apr 22 20:53:46.485541 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:46.485491 2579 scope.go:117] "RemoveContainer" containerID="5433ba9aea163265afcca029f05c48d201082137e9d72726da7caf60824ef4b4"
Apr 22 20:53:46.493675 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:46.493659 2579 scope.go:117] "RemoveContainer" containerID="bb622b775d592f331bbf861f28f2e008ee181392c5c3f43075bcaefdf88087c5"
Apr 22 20:53:46.502629 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:46.502613 2579 scope.go:117] "RemoveContainer" containerID="5433ba9aea163265afcca029f05c48d201082137e9d72726da7caf60824ef4b4"
Apr 22 20:53:46.502851 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:53:46.502835 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5433ba9aea163265afcca029f05c48d201082137e9d72726da7caf60824ef4b4\": container with ID starting with 5433ba9aea163265afcca029f05c48d201082137e9d72726da7caf60824ef4b4 not found: ID does not exist" containerID="5433ba9aea163265afcca029f05c48d201082137e9d72726da7caf60824ef4b4"
Apr 22 20:53:46.502921 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:46.502862 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5433ba9aea163265afcca029f05c48d201082137e9d72726da7caf60824ef4b4"} err="failed to get container status \"5433ba9aea163265afcca029f05c48d201082137e9d72726da7caf60824ef4b4\": rpc error: code = NotFound desc = could not find container \"5433ba9aea163265afcca029f05c48d201082137e9d72726da7caf60824ef4b4\": container with ID starting with 5433ba9aea163265afcca029f05c48d201082137e9d72726da7caf60824ef4b4 not found: ID does not exist"
Apr 22 20:53:46.502921 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:46.502886 2579 scope.go:117] "RemoveContainer" containerID="bb622b775d592f331bbf861f28f2e008ee181392c5c3f43075bcaefdf88087c5"
Apr 22 20:53:46.503266 ip-10-0-141-46 kubenswrapper[2579]: E0422 20:53:46.503244 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb622b775d592f331bbf861f28f2e008ee181392c5c3f43075bcaefdf88087c5\": container with ID starting with bb622b775d592f331bbf861f28f2e008ee181392c5c3f43075bcaefdf88087c5 not found: ID does not exist" containerID="bb622b775d592f331bbf861f28f2e008ee181392c5c3f43075bcaefdf88087c5"
Apr 22 20:53:46.503350 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:46.503272 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb622b775d592f331bbf861f28f2e008ee181392c5c3f43075bcaefdf88087c5"} err="failed to get container status \"bb622b775d592f331bbf861f28f2e008ee181392c5c3f43075bcaefdf88087c5\": rpc error: code = NotFound desc = could not find container \"bb622b775d592f331bbf861f28f2e008ee181392c5c3f43075bcaefdf88087c5\": container with ID starting with bb622b775d592f331bbf861f28f2e008ee181392c5c3f43075bcaefdf88087c5 not found: ID does not exist"
Apr 22 20:53:46.504388 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:46.504370 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj"]
Apr 22 20:53:46.507832 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:46.507815 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-bfqwj"]
Apr 22 20:53:46.695980 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:53:46.695953 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="400d87cb-6f81-444b-a147-9ccf6c6ba26b" path="/var/lib/kubelet/pods/400d87cb-6f81-444b-a147-9ccf6c6ba26b/volumes"
Apr 22 20:59:58.475072 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:59:58.474996 2579 ???:1] "http2: server: error reading preface from client 10.0.141.46:44488: read tcp 10.0.141.46:10250->10.0.141.46:44488: read: connection reset by peer"
Apr 22 20:59:58.486843 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:59:58.486816 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-vlm4l_d0402063-8f80-4f0b-8247-b3bd2ae51e51/global-pull-secret-syncer/0.log"
Apr 22 20:59:58.531484 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:59:58.531461 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-4jvwg_f2e80311-0394-4386-8366-ef53a6861178/konnectivity-agent/0.log"
Apr 22 20:59:58.670344 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:59:58.670324 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-46.ec2.internal_6551c028e13aff466c38397d8a508ac4/haproxy/0.log"
Apr 22 21:00:02.318733 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:02.318707 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sk4xd_5675d165-27d8-4220-9ec2-40aecf150447/node-exporter/0.log"
Apr 22 21:00:02.340618 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:02.340595 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sk4xd_5675d165-27d8-4220-9ec2-40aecf150447/kube-rbac-proxy/0.log"
Apr 22 21:00:02.358486 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:02.358466 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sk4xd_5675d165-27d8-4220-9ec2-40aecf150447/init-textfile/0.log"
Apr 22 21:00:05.891434 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:05.891396 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4sbb4_7c89560e-578e-4ca8-bb28-9608c190c546/dns/0.log"
Apr 22 21:00:05.915590 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:05.915569 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4sbb4_7c89560e-578e-4ca8-bb28-9608c190c546/kube-rbac-proxy/0.log"
Apr 22 21:00:05.993232 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:05.993208 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-vk6km_603cfe2d-1c53-4bc8-acc0-b1d1751c2817/dns-node-resolver/0.log"
Apr 22 21:00:06.365359 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.365274 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"]
Apr 22 21:00:06.365617 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.365602 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="400d87cb-6f81-444b-a147-9ccf6c6ba26b" containerName="storage-initializer"
Apr 22 21:00:06.365659 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.365619 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="400d87cb-6f81-444b-a147-9ccf6c6ba26b" containerName="storage-initializer"
Apr 22 21:00:06.365659 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.365632 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="400d87cb-6f81-444b-a147-9ccf6c6ba26b" containerName="kserve-container"
Apr 22 21:00:06.365659 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.365637 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="400d87cb-6f81-444b-a147-9ccf6c6ba26b" containerName="kserve-container"
Apr 22 21:00:06.365772 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.365688 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="400d87cb-6f81-444b-a147-9ccf6c6ba26b" containerName="kserve-container"
Apr 22 21:00:06.368511 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.368497 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"
Apr 22 21:00:06.370394 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.370372 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mrjvk\"/\"openshift-service-ca.crt\""
Apr 22 21:00:06.370528 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.370392 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-mrjvk\"/\"default-dockercfg-l7rpd\""
Apr 22 21:00:06.371077 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.371063 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mrjvk\"/\"kube-root-ca.crt\""
Apr 22 21:00:06.377081 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.377057 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"]
Apr 22 21:00:06.468056 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.468023 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/de1a661e-944c-4742-965d-4d6d3a702e0e-proc\") pod \"perf-node-gather-daemonset-wwstd\" (UID: \"de1a661e-944c-4742-965d-4d6d3a702e0e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"
Apr 22 21:00:06.468256 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.468064 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/de1a661e-944c-4742-965d-4d6d3a702e0e-sys\") pod \"perf-node-gather-daemonset-wwstd\" (UID: \"de1a661e-944c-4742-965d-4d6d3a702e0e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"
Apr 22 21:00:06.468256 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.468102 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/de1a661e-944c-4742-965d-4d6d3a702e0e-lib-modules\") pod \"perf-node-gather-daemonset-wwstd\" (UID: \"de1a661e-944c-4742-965d-4d6d3a702e0e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"
Apr 22 21:00:06.468256 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.468172 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/de1a661e-944c-4742-965d-4d6d3a702e0e-podres\") pod \"perf-node-gather-daemonset-wwstd\" (UID: \"de1a661e-944c-4742-965d-4d6d3a702e0e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"
Apr 22 21:00:06.468256 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.468198 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5mnn\" (UniqueName: \"kubernetes.io/projected/de1a661e-944c-4742-965d-4d6d3a702e0e-kube-api-access-z5mnn\") pod \"perf-node-gather-daemonset-wwstd\" (UID: \"de1a661e-944c-4742-965d-4d6d3a702e0e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"
Apr 22 21:00:06.557283 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.557260 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xzq74_799125ad-367b-42aa-a2e1-222e89529bac/node-ca/0.log"
Apr 22 21:00:06.568948 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.568922 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/de1a661e-944c-4742-965d-4d6d3a702e0e-podres\") pod \"perf-node-gather-daemonset-wwstd\" (UID: \"de1a661e-944c-4742-965d-4d6d3a702e0e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"
Apr 22 21:00:06.569070 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.568952 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5mnn\" (UniqueName: \"kubernetes.io/projected/de1a661e-944c-4742-965d-4d6d3a702e0e-kube-api-access-z5mnn\") pod \"perf-node-gather-daemonset-wwstd\" (UID: \"de1a661e-944c-4742-965d-4d6d3a702e0e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"
Apr 22 21:00:06.569070 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.568981 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/de1a661e-944c-4742-965d-4d6d3a702e0e-proc\") pod \"perf-node-gather-daemonset-wwstd\" (UID: \"de1a661e-944c-4742-965d-4d6d3a702e0e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"
Apr 22 21:00:06.569070 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.569033 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/de1a661e-944c-4742-965d-4d6d3a702e0e-sys\") pod \"perf-node-gather-daemonset-wwstd\" (UID: \"de1a661e-944c-4742-965d-4d6d3a702e0e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"
Apr 22 21:00:06.569274 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.569086 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/de1a661e-944c-4742-965d-4d6d3a702e0e-lib-modules\") pod \"perf-node-gather-daemonset-wwstd\" (UID: \"de1a661e-944c-4742-965d-4d6d3a702e0e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"
Apr 22 21:00:06.569274 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.569124 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/de1a661e-944c-4742-965d-4d6d3a702e0e-proc\") pod \"perf-node-gather-daemonset-wwstd\" (UID: \"de1a661e-944c-4742-965d-4d6d3a702e0e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"
Apr 22 21:00:06.569274 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.569160 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/de1a661e-944c-4742-965d-4d6d3a702e0e-podres\") pod \"perf-node-gather-daemonset-wwstd\" (UID: \"de1a661e-944c-4742-965d-4d6d3a702e0e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"
Apr 22 21:00:06.569274 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.569167 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/de1a661e-944c-4742-965d-4d6d3a702e0e-sys\") pod \"perf-node-gather-daemonset-wwstd\" (UID: \"de1a661e-944c-4742-965d-4d6d3a702e0e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"
Apr 22 21:00:06.569274 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.569253 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/de1a661e-944c-4742-965d-4d6d3a702e0e-lib-modules\") pod \"perf-node-gather-daemonset-wwstd\" (UID: \"de1a661e-944c-4742-965d-4d6d3a702e0e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"
Apr 22 21:00:06.575740 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.575721 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5mnn\" (UniqueName: \"kubernetes.io/projected/de1a661e-944c-4742-965d-4d6d3a702e0e-kube-api-access-z5mnn\") pod \"perf-node-gather-daemonset-wwstd\" (UID: \"de1a661e-944c-4742-965d-4d6d3a702e0e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"
Apr 22 21:00:06.677981 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.677946 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"
Apr 22 21:00:06.793034 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.793003 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"]
Apr 22 21:00:06.795522 ip-10-0-141-46 kubenswrapper[2579]: W0422 21:00:06.795491 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podde1a661e_944c_4742_965d_4d6d3a702e0e.slice/crio-03132d054372a072a2e590554e9113ad28226c304d4555939a050f9f120b4272 WatchSource:0}: Error finding container 03132d054372a072a2e590554e9113ad28226c304d4555939a050f9f120b4272: Status 404 returned error can't find the container with id 03132d054372a072a2e590554e9113ad28226c304d4555939a050f9f120b4272
Apr 22 21:00:06.797222 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:06.797203 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 21:00:07.416549 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:07.416511 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd" event={"ID":"de1a661e-944c-4742-965d-4d6d3a702e0e","Type":"ContainerStarted","Data":"d88deb0b2d5d6cbbcd1f3cf5fdb24a036aab7423f524c8100eae6216bc905d31"}
Apr 22 21:00:07.416549 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:07.416545 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd" event={"ID":"de1a661e-944c-4742-965d-4d6d3a702e0e","Type":"ContainerStarted","Data":"03132d054372a072a2e590554e9113ad28226c304d4555939a050f9f120b4272"}
Apr 22 21:00:07.416949 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:07.416631 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"
Apr 22 21:00:07.432323 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:07.432278 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd" podStartSLOduration=1.432264256 podStartE2EDuration="1.432264256s" podCreationTimestamp="2026-04-22 21:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:00:07.431388768 +0000 UTC m=+3713.385751569" watchObservedRunningTime="2026-04-22 21:00:07.432264256 +0000 UTC m=+3713.386627057"
Apr 22 21:00:13.430395 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:13.430359 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-wwstd"
Apr 22 21:00:53.585846 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:53.585815 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-wm9mw_ea808e89-1697-4235-8c42-8202cc97fef9/serve-healthcheck-canary/0.log"
Apr 22 21:00:53.982885 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:53.982859 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-b8fc9_ea34c500-4ae8-491f-a03b-da5186e37c7d/kube-rbac-proxy/0.log"
Apr 22 21:00:54.000635 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:54.000614 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-b8fc9_ea34c500-4ae8-491f-a03b-da5186e37c7d/exporter/0.log"
Apr 22 21:00:54.019326 ip-10-0-141-46 kubenswrapper[2579]: I0422 21:00:54.019307 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-b8fc9_ea34c500-4ae8-491f-a03b-da5186e37c7d/extractor/0.log"
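Records like the ones throughout this log share a fixed shape: syslog timestamp, host, kubenswrapper[PID], klog level and timestamp, source file:line, then the message. A sketch of splitting those fields apart, with a regex tuned to these lines rather than a general journald parser:

```go
// Sketch of a field splitter for the kubenswrapper records above,
// e.g. to count "Probe failed" entries per pod. Capture groups:
// syslog time, host, pid, klog level (I/W/E), source location, message.
package main

import (
	"fmt"
	"regexp"
)

var record = regexp.MustCompile(
	`^(\w{3} \d{2} [\d:.]+) (\S+) kubenswrapper\[(\d+)\]: ([IWE])\d{4} [\d:.]+ \d+ (\S+)\] (.*)$`)

func main() {
	line := `Apr 22 20:50:48.985505 ip-10-0-141-46 kubenswrapper[2579]: I0422 20:50:48.985452 2579 prober.go:120] "Probe failed" probeType="Readiness"`
	m := record.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("no match")
		return
	}
	fmt.Printf("time=%s host=%s pid=%s level=%s src=%s msg=%s\n",
		m[1], m[2], m[3], m[4], m[5], m[6])
}
```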