Apr 16 16:00:18.572264 ip-10-0-135-144 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 16:00:18.572277 ip-10-0-135-144 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 16:00:18.572287 ip-10-0-135-144 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 16:00:18.572600 ip-10-0-135-144 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 16:00:28.802603 ip-10-0-135-144 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 16:00:28.802626 ip-10-0-135-144 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot f1ff8eaf7c99486697de170d2cb9f69b --
Apr 16 16:02:51.717647 ip-10-0-135-144 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 16:02:52.113765 ip-10-0-135-144 kubenswrapper[2581]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:02:52.113765 ip-10-0-135-144 kubenswrapper[2581]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 16:02:52.113765 ip-10-0-135-144 kubenswrapper[2581]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:02:52.113765 ip-10-0-135-144 kubenswrapper[2581]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 16:02:52.113765 ip-10-0-135-144 kubenswrapper[2581]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
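
Everything from this restart onward is klog output from the kubelet (wrapped here as kubenswrapper). A header such as "W0416 16:02:52.121780 2581 feature_gate.go:328]" decodes as: severity (I=info, W=warning, E=error, F=fatal), month and day (0416), wall-clock time with microseconds, the emitting PID (2581), and the source file and line. A minimal Python sketch for triaging an excerpt like this one, assuming it has been saved to a local file named kubelet.log (the file name is illustrative, e.g. captured via journalctl, and is not specified by the log itself):

    import re
    from collections import Counter

    # klog header: <sev><MMDD> <HH:MM:SS.micros> <pid> <file:line>] <message>
    # e.g. "W0416 16:02:52.121780 2581 feature_gate.go:328] unrecognized feature gate: ..."
    KLOG = re.compile(
        r"(?P<sev>[IWEF])(?P<mmdd>\d{4}) (?P<time>\d{2}:\d{2}:\d{2}\.\d{6}) "
        r"(?P<pid>\d+) (?P<src>\S+:\d+)\] (?P<msg>.*)"
    )

    by_sev, by_src = Counter(), Counter()
    with open("kubelet.log") as f:          # assumed local copy of this excerpt
        for line in f:
            m = KLOG.search(line)
            if m is None:
                continue                    # systemd records, "-- Boot ... --" markers, etc.
            by_sev[m["sev"]] += 1
            by_src[m["src"]] += 1

    print("records by severity:", dict(by_sev))
    for src, n in by_src.most_common(5):    # noisiest source locations first
        print(f"{n:5d}  {src}")

In this excerpt the warnings come almost entirely from feature_gate.go:328, so a per-source tally like this quickly separates the feature-gate noise below from the actual startup failures above.
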
Apr 16 16:02:52.117100 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.116853 2581 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 16:02:52.121798 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121780 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:02:52.121798 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121798 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:02:52.121865 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121802 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:02:52.121865 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121806 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:02:52.121865 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121810 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:02:52.121865 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121813 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:02:52.121865 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121816 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:02:52.121865 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121819 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:02:52.121865 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121822 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:02:52.121865 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121825 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:02:52.121865 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121828 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:02:52.121865 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121830 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:02:52.121865 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121833 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:02:52.121865 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121836 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:02:52.121865 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121838 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:02:52.121865 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121841 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:02:52.121865 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121844 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:02:52.121865 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121846 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:02:52.121865 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121849 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:02:52.121865 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121851 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:02:52.121865 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121854 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:02:52.122305 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121865 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:02:52.122305 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121868 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:02:52.122305 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121871 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:02:52.122305 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121874 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:02:52.122305 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121877 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:02:52.122305 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121880 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:02:52.122305 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121882 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:02:52.122305 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121885 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:02:52.122305 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121888 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:02:52.122305 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121890 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:02:52.122305 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121893 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:02:52.122305 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121895 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:02:52.122305 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121897 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:02:52.122305 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121900 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:02:52.122305 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121903 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:02:52.122305 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121905 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:02:52.122305 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121908 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:02:52.122305 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121910 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:02:52.122305 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121913 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:02:52.122305 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121915 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:02:52.122832 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121918 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:02:52.122832 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121920 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:02:52.122832 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121923 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:02:52.122832 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121925 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:02:52.122832 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121928 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:02:52.122832 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121931 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:02:52.122832 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121933 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:02:52.122832 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121936 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:02:52.122832 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121938 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:02:52.122832 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121942 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:02:52.122832 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121945 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:02:52.122832 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121947 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:02:52.122832 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121950 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:02:52.122832 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121953 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:02:52.122832 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121957 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:02:52.122832 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121961 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:02:52.122832 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121964 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:02:52.122832 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121966 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:02:52.122832 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121970 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:02:52.123290 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121974 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:02:52.123290 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121979 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:02:52.123290 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121982 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:02:52.123290 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121985 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:02:52.123290 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121988 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:02:52.123290 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121990 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:02:52.123290 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121993 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:02:52.123290 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.121996 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:02:52.123290 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122000 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:02:52.123290 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122002 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:02:52.123290 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122005 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:02:52.123290 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122008 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:02:52.123290 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122011 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:02:52.123290 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122013 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:02:52.123290 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122016 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:02:52.123290 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122018 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:02:52.123290 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122021 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:02:52.123290 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122023 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:02:52.123290 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122027 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:02:52.123290 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122030 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:02:52.123781 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122033 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:02:52.123781 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122035 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:02:52.123781 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122038 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:02:52.123781 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122041 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:02:52.123781 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122043 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:02:52.123781 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122046 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:02:52.123781 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122486 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:02:52.123781 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122492 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:02:52.123781 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122495 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:02:52.123781 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122498 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:02:52.123781 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122501 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:02:52.123781 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122504 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:02:52.123781 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122507 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:02:52.123781 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122510 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:02:52.123781 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122512 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:02:52.123781 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122515 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:02:52.123781 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122518 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:02:52.123781 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122520 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:02:52.123781 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122524 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:02:52.124231 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122528 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:02:52.124231 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122531 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:02:52.124231 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122534 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:02:52.124231 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122537 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:02:52.124231 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122540 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:02:52.124231 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122542 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:02:52.124231 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122545 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:02:52.124231 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122547 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:02:52.124231 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122550 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:02:52.124231 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122553 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:02:52.124231 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122556 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:02:52.124231 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122559 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:02:52.124231 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122561 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:02:52.124231 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122564 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:02:52.124231 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122566 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:02:52.124231 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122569 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:02:52.124231 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122572 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:02:52.124231 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122575 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:02:52.124231 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122577 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:02:52.124231 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122581 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:02:52.124757 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122583 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:02:52.124757 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122586 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:02:52.124757 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122589 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:02:52.124757 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122591 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:02:52.124757 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122594 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:02:52.124757 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122596 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:02:52.124757 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122599 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:02:52.124757 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122601 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:02:52.124757 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122604 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:02:52.124757 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122607 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:02:52.124757 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122609 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:02:52.124757 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122612 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:02:52.124757 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122614 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:02:52.124757 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122617 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:02:52.124757 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122619 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:02:52.124757 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122622 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:02:52.124757 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122624 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:02:52.124757 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122627 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:02:52.124757 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122629 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:02:52.124757 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122632 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:02:52.125268 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122635 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:02:52.125268 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122638 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:02:52.125268 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122640 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:02:52.125268 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122643 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:02:52.125268 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122646 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:02:52.125268 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122648 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:02:52.125268 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122651 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:02:52.125268 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122654 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:02:52.125268 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122657 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:02:52.125268 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122659 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:02:52.125268 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122663 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:02:52.125268 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122666 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:02:52.125268 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122668 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:02:52.125268 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122671 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:02:52.125268 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122673 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:02:52.125268 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122675 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:02:52.125268 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122678 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:02:52.125268 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122680 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:02:52.125268 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122683 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:02:52.125268 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122685 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122688 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122692 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122695 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122698 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122701 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122703 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122705 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122708 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122710 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122713 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122715 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122718 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.122721 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122803 2581 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122815 2581 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122826 2581 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122833 2581 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122840 2581 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122846 2581 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122852 2581 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 16:02:52.125837 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122857 2581 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122860 2581 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122864 2581 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122867 2581 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122871 2581 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122874 2581 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122877 2581 flags.go:64] FLAG: --cgroup-root=""
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122880 2581 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122883 2581 flags.go:64] FLAG: --client-ca-file=""
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122886 2581 flags.go:64] FLAG: --cloud-config=""
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122889 2581 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122892 2581 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122897 2581 flags.go:64] FLAG: --cluster-domain=""
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122900 2581 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122903 2581 flags.go:64] FLAG: --config-dir=""
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122905 2581 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122909 2581 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122913 2581 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122916 2581 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122919 2581 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122922 2581 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122925 2581 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122929 2581 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122932 2581 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122935 2581 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 16:02:52.126365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122938 2581 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122942 2581 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122954 2581 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122957 2581 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122961 2581 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122964 2581 flags.go:64] FLAG: --enable-server="true"
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122967 2581 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122972 2581 flags.go:64] FLAG: --event-burst="100"
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122975 2581 flags.go:64] FLAG: --event-qps="50"
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122978 2581 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122981 2581 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122985 2581 flags.go:64] FLAG: --eviction-hard=""
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122988 2581 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122991 2581 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122994 2581 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.122997 2581 flags.go:64] FLAG: --eviction-soft=""
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123000 2581 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123004 2581 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123007 2581 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123010 2581 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123013 2581 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123015 2581 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123018 2581 flags.go:64] FLAG: --feature-gates=""
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123022 2581 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123025 2581 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 16:02:52.126979 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123028 2581 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123032 2581 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123035 2581 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123038 2581 flags.go:64] FLAG: --help="false"
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123041 2581 flags.go:64] FLAG: --hostname-override="ip-10-0-135-144.ec2.internal"
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123044 2581 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123047 2581 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123050 2581 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123053 2581 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123058 2581 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123062 2581 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123065 2581 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123068 2581 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123071 2581 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123074 2581 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123077 2581 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123080 2581 flags.go:64] FLAG: --kube-reserved=""
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123083 2581 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123086 2581 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123089 2581 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123092 2581 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123095 2581 flags.go:64] FLAG: --lock-file=""
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123097 2581 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123100 2581 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 16:02:52.127589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123103 2581 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123109 2581 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123112 2581 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123115 2581 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123117 2581 flags.go:64] FLAG: --logging-format="text"
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123120 2581 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123124 2581 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123127 2581 flags.go:64] FLAG: --manifest-url=""
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123130 2581 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123134 2581 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123137 2581 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123141 2581 flags.go:64] FLAG: --max-pods="110"
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123144 2581 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123147 2581 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123150 2581 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123153 2581 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123155 2581 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123163 2581 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123166 2581 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123175 2581 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123178 2581 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123181 2581 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123185 2581 flags.go:64] FLAG: --pod-cidr=""
Apr 16 16:02:52.128164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123188 2581 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123194 2581 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123197 2581 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123200 2581 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123204 2581 flags.go:64] FLAG: --port="10250"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123207 2581 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123210 2581 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0198c139fd0ab7c9b"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123213 2581 flags.go:64] FLAG: --qos-reserved=""
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123216 2581 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123219 2581 flags.go:64] FLAG: --register-node="true"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123221 2581 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123224 2581 flags.go:64] FLAG: --register-with-taints=""
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123228 2581 flags.go:64] FLAG: --registry-burst="10"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123231 2581 flags.go:64] FLAG: --registry-qps="5"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123234 2581 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123237 2581 flags.go:64] FLAG: --reserved-memory=""
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123241 2581 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123244 2581 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123247 2581 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123250 2581 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123253 2581 flags.go:64] FLAG: --runonce="false"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123256 2581 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123259 2581 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123262 2581 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123265 2581 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123268 2581 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 16:02:52.128768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123272 2581 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123275 2581 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123278 2581 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123282 2581 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123285 2581 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123288 2581 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123291 2581 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123294 2581 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123298 2581 flags.go:64] FLAG: --system-cgroups=""
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123301 2581 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123307 2581 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123310 2581 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123313 2581 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123317 2581 flags.go:64] FLAG: --tls-min-version=""
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123320 2581 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123322 2581 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123325 2581 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123328 2581 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123331 2581 flags.go:64] FLAG: --v="2"
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123348 2581 flags.go:64] FLAG: --version="false"
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123352 2581 flags.go:64] FLAG: --vmodule=""
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123357 2581 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.123360 2581 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123485 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:02:52.129421 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123489 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:02:52.130067 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123493 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:02:52.130067 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123496 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:02:52.130067 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123499 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:02:52.130067 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123502 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:02:52.130067 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123504 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:02:52.130067 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123507 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:02:52.130067 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123509 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:02:52.130067 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123514 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:02:52.130067 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123517 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:02:52.130067 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123520 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:02:52.130067 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123522 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:02:52.130067 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123529 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:02:52.130067 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123532 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:02:52.130067 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123534 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:02:52.130067 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123537 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:02:52.130067 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123540 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:02:52.130067 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123544 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:02:52.130067 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123547 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:02:52.130067 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123550 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:02:52.130067 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123553 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:02:52.130594 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123556 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:02:52.130594 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123559 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:02:52.130594 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123563 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:02:52.130594 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123567 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:02:52.130594 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123570 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:02:52.130594 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123573 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:02:52.130594 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123576 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:02:52.130594 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123579 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:02:52.130594 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123582 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:02:52.130594 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123585 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:02:52.130594 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123588 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:02:52.130594 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123590 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:02:52.130594 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123593 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:02:52.130594 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123596 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:02:52.130594 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123598 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:02:52.130594 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123601 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:02:52.130594 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123604 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:02:52.130594 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123606 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:02:52.130594 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123609 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:02:52.131062 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123613 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:02:52.131062 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123615 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:02:52.131062 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123619 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:02:52.131062 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123621 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:02:52.131062 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123626 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:02:52.131062 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123629 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 
Apr 16 16:02:52.131062 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123632 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:02:52.131062 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123635 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:02:52.131062 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123637 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:02:52.131062 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123640 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:02:52.131062 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123642 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:02:52.131062 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123645 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:02:52.131062 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123648 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:02:52.131062 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123650 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:02:52.131062 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123653 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:02:52.131062 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123655 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:02:52.131062 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123658 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:02:52.131062 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123660 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:02:52.131062 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123663 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:02:52.131062 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123666 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:02:52.131570 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123668 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:02:52.131570 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123670 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:02:52.131570 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123673 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:02:52.131570 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123675 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:02:52.131570 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123678 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:02:52.131570 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123681 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:02:52.131570 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123683 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:02:52.131570 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123685 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:02:52.131570 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123688 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:02:52.131570 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123691 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:02:52.131570 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123693 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:02:52.131570 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123696 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:02:52.131570 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123699 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:02:52.131570 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123702 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:02:52.131570 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123705 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:02:52.131570 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123707 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:02:52.131570 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123711 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:02:52.131570 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123714 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:02:52.131570 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123722 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:02:52.131570 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123724 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:02:52.132066 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123727 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:02:52.132066 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123730 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:02:52.132066 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123732 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:02:52.132066 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123735 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:02:52.132066 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.123737 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:02:52.132066 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.124316 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:02:52.132066 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.130950 2581 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 16:02:52.132066 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.130967 2581 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 16:02:52.132066 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131018 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:02:52.132066 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131022 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:02:52.132066 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131026 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:02:52.132066 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131029 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:02:52.132066 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131032 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:02:52.132066 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131035 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:02:52.132066 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131039 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:02:52.132516 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131041 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:02:52.132516 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131046 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:02:52.132516 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131050 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:02:52.132516 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131054 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:02:52.132516 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131056 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:02:52.132516 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131059 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:02:52.132516 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131062 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:02:52.132516 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131066 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:02:52.132516 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131069 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:02:52.132516 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131072 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:02:52.132516 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131075 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:02:52.132516 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131079 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:02:52.132516 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131081 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:02:52.132516 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131084 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:02:52.132516 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131087 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:02:52.132516 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131089 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:02:52.132516 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131092 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:02:52.132516 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131095 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:02:52.132516 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131097 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:02:52.133001 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131100 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:02:52.133001 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131103 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:02:52.133001 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131105 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:02:52.133001 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131109 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:02:52.133001 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131113 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:02:52.133001 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131117 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:02:52.133001 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131120 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:02:52.133001 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131123 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:02:52.133001 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131126 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:02:52.133001 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131128 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:02:52.133001 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131131 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:02:52.133001 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131134 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:02:52.133001 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131136 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:02:52.133001 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131139 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:02:52.133001 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131142 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:02:52.133001 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131145 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:02:52.133001 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131147 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:02:52.133001 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131150 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:02:52.133001 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131152 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:02:52.133001 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131155 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:02:52.133526 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131157 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:02:52.133526 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131160 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:02:52.133526 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131163 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:02:52.133526 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131165 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:02:52.133526 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131168 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:02:52.133526 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131171 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:02:52.133526 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131173 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:02:52.133526 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131176 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:02:52.133526 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131178 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:02:52.133526 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131181 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:02:52.133526 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131184 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:02:52.133526 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131186 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:02:52.133526 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131188 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:02:52.133526 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131191 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:02:52.133526 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131194 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:02:52.133526 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131196 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:02:52.133526 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131199 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:02:52.133526 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131202 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:02:52.133526 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131209 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:02:52.133526 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131211 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:02:52.134007 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131214 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:02:52.134007 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131217 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:02:52.134007 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131219 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:02:52.134007 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131222 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:02:52.134007 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131225 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:02:52.134007 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131228 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:02:52.134007 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131230 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:02:52.134007 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131233 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:02:52.134007 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131235 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:02:52.134007 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131238 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:02:52.134007 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131240 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:02:52.134007 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131243 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:02:52.134007 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131246 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:02:52.134007 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131248 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:02:52.134007 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131251 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:02:52.134007 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131253 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:02:52.134007 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131256 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:02:52.134007 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131258 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:02:52.134007 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131261 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:02:52.134485 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131263 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:02:52.134485 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.131269 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:02:52.134485 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131389 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:02:52.134485 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131397 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:02:52.134485 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131400 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:02:52.134485 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131404 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:02:52.134485 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131407 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:02:52.134485 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131410 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:02:52.134485 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131412 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:02:52.134485 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131415 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:02:52.134485 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131418 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:02:52.134485 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131421 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:02:52.134485 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131425 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:02:52.134485 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131428 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:02:52.134485 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131430 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:02:52.134485 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131433 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:02:52.134890 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131436 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:02:52.134890 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131438 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:02:52.134890 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131441 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:02:52.134890 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131444 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:02:52.134890 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131447 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:02:52.134890 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131449 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:02:52.134890 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131452 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:02:52.134890 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131455 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:02:52.134890 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131457 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:02:52.134890 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131460 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:02:52.134890 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131462 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:02:52.134890 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131465 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:02:52.134890 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131468 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:02:52.134890 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131470 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:02:52.134890 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131473 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:02:52.134890 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131475 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:02:52.134890 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131478 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:02:52.134890 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131481 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:02:52.134890 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131483 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:02:52.134890 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131486 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:02:52.135409 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131488 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:02:52.135409 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131491 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:02:52.135409 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131495 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:02:52.135409 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131498 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:02:52.135409 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131501 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:02:52.135409 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131505 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:02:52.135409 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131507 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:02:52.135409 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131510 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:02:52.135409 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131513 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:02:52.135409 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131523 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:02:52.135409 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131525 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:02:52.135409 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131528 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:02:52.135409 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131531 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:02:52.135409 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131534 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:02:52.135409 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131536 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:02:52.135409 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131539 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:02:52.135409 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131541 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:02:52.135409 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131544 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:02:52.135409 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131546 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:02:52.135862 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131549 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:02:52.135862 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131551 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:02:52.135862 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131554 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:02:52.135862 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131556 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:02:52.135862 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131559 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:02:52.135862 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131561 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:02:52.135862 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131564 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:02:52.135862 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131567 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:02:52.135862 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131569 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:02:52.135862 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131572 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:02:52.135862 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131574 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:02:52.135862 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131577 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:02:52.135862 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131580 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:02:52.135862 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131583 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:02:52.135862 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131586 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:02:52.135862 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131589 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:02:52.135862 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131591 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:02:52.135862 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131594 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:02:52.135862 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131597 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:02:52.135862 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131600 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:02:52.136364 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131603 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:02:52.136364 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131605 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:02:52.136364 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131608 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:02:52.136364 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131618 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:02:52.136364 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131621 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:02:52.136364 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131623 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:02:52.136364 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131626 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:02:52.136364 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131628 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:02:52.136364 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131631 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:02:52.136364 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131633 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:02:52.136364 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131636 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:02:52.136364 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131639 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:02:52.136364 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:52.131641 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:02:52.136364 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.131646 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:02:52.136364 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.132295 2581 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 16:02:52.136744 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.134276 2581 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 16:02:52.136744 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.135177 2581 server.go:1019] "Starting client certificate rotation"
Apr 16 16:02:52.136744 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.135270 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:02:52.136744 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.136006 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:02:52.156757 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.156728 2581 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:02:52.158410 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.158376 2581 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:02:52.177115 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.177088 2581 log.go:25] "Validated CRI v1 runtime API"
Apr 16 16:02:52.181728 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.181712 2581 log.go:25] "Validated CRI v1 image API"
Apr 16 16:02:52.182996 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.182974 2581 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 16:02:52.189003 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.188975 2581 fs.go:135] Filesystem UUIDs: map[544d59e3-988c-477f-866f-578dcafdeeb9:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 dbc4c04f-1493-409e-ba9d-e70c9abe042a:/dev/nvme0n1p3]
Apr 16 16:02:52.189072 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.189003 2581 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 16:02:52.195198 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.195079 2581 manager.go:217] Machine: {Timestamp:2026-04-16 16:02:52.193353659 +0000 UTC m=+0.370414726 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101103 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27e75a53b994beb31145fe1dd27d6e SystemUUID:ec27e75a-53b9-94be-b311-45fe1dd27d6e BootID:f1ff8eaf-7c99-4866-97de-170d2cb9f69b Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:89:02:cf:85:2f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:89:02:cf:85:2f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:76:97:8c:34:3a:61 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 16:02:52.195198 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.195193 2581 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 16:02:52.195311 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.195284 2581 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 16:02:52.196283 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.196253 2581 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 16:02:52.196435 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.196286 2581 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-144.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 16:02:52.196484 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.196444 2581 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 16:02:52.196484 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.196453 2581 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 16:02:52.196484 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.196466 2581 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:02:52.197151 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.197140 2581 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:02:52.199201 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.199187 2581 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:02:52.199365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.199355 2581 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 16:02:52.201476 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.201465 2581 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 16:02:52.201510 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.201497 2581 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
path" path="/etc/kubernetes/manifests" Apr 16 16:02:52.201543 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.201516 2581 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 16:02:52.201543 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.201532 2581 kubelet.go:397] "Adding apiserver pod source" Apr 16 16:02:52.201597 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.201545 2581 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 16:02:52.201645 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.201625 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 16:02:52.202581 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.202567 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 16:02:52.202663 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.202586 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 16:02:52.205308 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.205293 2581 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 16:02:52.207071 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.207054 2581 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 16:02:52.208383 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.208368 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 16:02:52.208471 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.208403 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 16:02:52.208471 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.208413 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 16:02:52.208471 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.208423 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 16:02:52.208471 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.208431 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 16:02:52.208471 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.208440 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 16:02:52.208471 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.208449 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 16:02:52.208471 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.208458 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 16:02:52.208471 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.208468 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 16:02:52.208716 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.208477 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 16:02:52.208716 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.208499 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 16:02:52.208716 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.208512 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 
Apr 16 16:02:52.209233 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.209222 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 16:02:52.209285 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.209235 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 16:02:52.212889 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.212874 2581 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 16:02:52.212984 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.212919 2581 server.go:1295] "Started kubelet"
Apr 16 16:02:52.213034 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.213002 2581 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 16:02:52.213162 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.213099 2581 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 16:02:52.213205 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.213187 2581 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 16:02:52.213952 ip-10-0-135-144 systemd[1]: Started Kubernetes Kubelet.
Apr 16 16:02:52.214620 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.214410 2581 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 16:02:52.214821 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.214795 2581 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 16:02:52.215232 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.215205 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 16:02:52.215232 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.215213 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 16:02:52.215806 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.215789 2581 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 16:02:52.221198 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.221180 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 16:02:52.221552 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.221536 2581 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 16:02:52.222144 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.222118 2581 factory.go:55] Registering systemd factory
Apr 16 16:02:52.222144 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.222137 2581 factory.go:223] Registration of the systemd container factory successfully
Apr 16 16:02:52.222458 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.222429 2581 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 16:02:52.222458 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.222454 2581 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 16:02:52.222578 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.222438 2581 factory.go:153] Registering CRI-O factory
Apr 16 16:02:52.222578 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.222560 2581 factory.go:223] Registration of the crio container factory successfully
Apr 16 16:02:52.222779 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.222747 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found"
Apr 16 16:02:52.222893 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.222810 2581 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 16:02:52.222893 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.222818 2581 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 16:02:52.223044 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.222924 2581 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 16:02:52.223234 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.223215 2581 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 16:02:52.223323 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.223270 2581 factory.go:103] Registering Raw factory
Apr 16 16:02:52.223323 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.223289 2581 manager.go:1196] Started watching for new ooms in manager
Apr 16 16:02:52.223607 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.222661 2581 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-144.ec2.internal.18a6e1ccab9b1ef4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-144.ec2.internal,UID:ip-10-0-135-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-144.ec2.internal,},FirstTimestamp:2026-04-16 16:02:52.212887284 +0000 UTC m=+0.389948334,LastTimestamp:2026-04-16 16:02:52.212887284 +0000 UTC m=+0.389948334,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-144.ec2.internal,}"
Apr 16 16:02:52.224469 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.224450 2581 manager.go:319] Starting recovery of all containers
Apr 16 16:02:52.224757 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.224737 2581 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 16:02:52.230229 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.230094 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
protocol="IPv6" Apr 16 16:02:52.234493 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.234475 2581 manager.go:324] Recovery completed Apr 16 16:02:52.235135 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.235079 2581 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 16:02:52.235222 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.235202 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 16:02:52.238953 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.238941 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:02:52.241773 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.241759 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:02:52.241828 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.241788 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:02:52.241828 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.241798 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:02:52.242247 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.242232 2581 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 16:02:52.242247 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.242245 2581 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 16:02:52.242364 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.242259 2581 state_mem.go:36] "Initialized new in-memory state store" Apr 16 16:02:52.244556 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.244545 2581 policy_none.go:49] "None policy: Start" Apr 16 16:02:52.244618 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.244561 2581 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 16:02:52.244618 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.244570 2581 state_mem.go:35] "Initializing new in-memory state store" Apr 16 16:02:52.245505 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.245441 2581 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-144.ec2.internal.18a6e1ccad53e7bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-144.ec2.internal,UID:ip-10-0-135-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-135-144.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-135-144.ec2.internal,},FirstTimestamp:2026-04-16 16:02:52.241774524 +0000 UTC m=+0.418835570,LastTimestamp:2026-04-16 16:02:52.241774524 +0000 UTC m=+0.418835570,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-144.ec2.internal,}" Apr 16 16:02:52.259365 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.259280 2581 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-144.ec2.internal.18a6e1ccad542d60 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-144.ec2.internal,UID:ip-10-0-135-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-135-144.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-135-144.ec2.internal,},FirstTimestamp:2026-04-16 16:02:52.241792352 +0000 UTC m=+0.418853398,LastTimestamp:2026-04-16 16:02:52.241792352 +0000 UTC m=+0.418853398,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-144.ec2.internal,}" Apr 16 16:02:52.275385 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.275284 2581 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-144.ec2.internal.18a6e1ccad545370 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-144.ec2.internal,UID:ip-10-0-135-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-135-144.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-135-144.ec2.internal,},FirstTimestamp:2026-04-16 16:02:52.241802096 +0000 UTC m=+0.418863141,LastTimestamp:2026-04-16 16:02:52.241802096 +0000 UTC m=+0.418863141,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-144.ec2.internal,}" Apr 16 16:02:52.277834 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.277819 2581 manager.go:341] "Starting Device Plugin manager" Apr 16 16:02:52.277922 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.277857 2581 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 16:02:52.277922 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.277881 2581 server.go:85] "Starting device plugin registration server" Apr 16 16:02:52.278169 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.278152 2581 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 16:02:52.278276 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.278171 2581 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 16:02:52.278276 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.278266 2581 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 16:02:52.278543 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.278401 2581 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 16:02:52.278543 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.278412 2581 plugin_manager.go:118] "Starting 
Kubelet Plugin Manager" Apr 16 16:02:52.296420 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.278985 2581 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 16:02:52.296420 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.279026 2581 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-144.ec2.internal\" not found" Apr 16 16:02:52.306300 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.306274 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-f5dgt" Apr 16 16:02:52.307660 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.307585 2581 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-144.ec2.internal.18a6e1ccb011830b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-144.ec2.internal,UID:ip-10-0-135-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-135-144.ec2.internal,},FirstTimestamp:2026-04-16 16:02:52.287755019 +0000 UTC m=+0.464816052,LastTimestamp:2026-04-16 16:02:52.287755019 +0000 UTC m=+0.464816052,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-144.ec2.internal,}" Apr 16 16:02:52.318403 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.318383 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-f5dgt" Apr 16 16:02:52.371050 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.370974 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 16:02:52.371050 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.371009 2581 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 16:02:52.371050 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.371029 2581 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 16:02:52.371050 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.371036 2581 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 16:02:52.371314 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.371068 2581 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 16:02:52.378992 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.378968 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:02:52.380550 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.380521 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:02:52.380679 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.380557 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:02:52.380679 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.380567 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:02:52.380679 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.380591 2581 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-144.ec2.internal" Apr 16 16:02:52.384814 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.384788 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:02:52.395421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.395403 2581 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-144.ec2.internal" Apr 16 16:02:52.395528 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.395428 2581 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-144.ec2.internal\": node \"ip-10-0-135-144.ec2.internal\" not found" Apr 16 16:02:52.442215 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.442183 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 16 16:02:52.472059 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.472025 2581 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal"] Apr 16 16:02:52.472193 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.472103 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:02:52.473609 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.473592 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:02:52.473718 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.473621 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:02:52.473718 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.473633 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:02:52.474819 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.474805 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:02:52.474978 ip-10-0-135-144 
kubenswrapper[2581]: I0416 16:02:52.474950 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal" Apr 16 16:02:52.475040 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.474996 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:02:52.475579 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.475563 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:02:52.475675 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.475590 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:02:52.475675 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.475605 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:02:52.475675 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.475616 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:02:52.475675 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.475628 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:02:52.475675 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.475643 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:02:52.477386 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.477372 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" Apr 16 16:02:52.477450 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.477399 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:02:52.478124 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.478109 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:02:52.478193 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.478134 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:02:52.478193 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.478148 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:02:52.507127 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.507104 2581 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-144.ec2.internal\" not found" node="ip-10-0-135-144.ec2.internal" Apr 16 16:02:52.511696 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.511681 2581 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-144.ec2.internal\" not found" node="ip-10-0-135-144.ec2.internal" Apr 16 16:02:52.524957 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.524937 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6668c770a04eab2dde37a1f32511d490-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal\" (UID: \"6668c770a04eab2dde37a1f32511d490\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" Apr 16 16:02:52.525020 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.524962 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bb09a532e924a099c0b64a77b795e41f-config\") pod \"kube-apiserver-proxy-ip-10-0-135-144.ec2.internal\" (UID: \"bb09a532e924a099c0b64a77b795e41f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal" Apr 16 16:02:52.525020 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.524982 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6668c770a04eab2dde37a1f32511d490-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal\" (UID: \"6668c770a04eab2dde37a1f32511d490\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" Apr 16 16:02:52.542537 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.542516 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 16 16:02:52.625258 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.625188 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bb09a532e924a099c0b64a77b795e41f-config\") pod \"kube-apiserver-proxy-ip-10-0-135-144.ec2.internal\" (UID: \"bb09a532e924a099c0b64a77b795e41f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal" Apr 16 
16:02:52.625258 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.625221 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6668c770a04eab2dde37a1f32511d490-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal\" (UID: \"6668c770a04eab2dde37a1f32511d490\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" Apr 16 16:02:52.625258 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.625240 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6668c770a04eab2dde37a1f32511d490-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal\" (UID: \"6668c770a04eab2dde37a1f32511d490\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" Apr 16 16:02:52.625454 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.625297 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bb09a532e924a099c0b64a77b795e41f-config\") pod \"kube-apiserver-proxy-ip-10-0-135-144.ec2.internal\" (UID: \"bb09a532e924a099c0b64a77b795e41f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal" Apr 16 16:02:52.625454 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.625305 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6668c770a04eab2dde37a1f32511d490-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal\" (UID: \"6668c770a04eab2dde37a1f32511d490\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" Apr 16 16:02:52.625454 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.625305 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6668c770a04eab2dde37a1f32511d490-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal\" (UID: \"6668c770a04eab2dde37a1f32511d490\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" Apr 16 16:02:52.642710 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.642686 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 16 16:02:52.743351 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.743308 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 16 16:02:52.810541 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.810516 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal" Apr 16 16:02:52.814017 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:52.814000 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" Apr 16 16:02:52.844418 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.844382 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 16 16:02:52.945096 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:52.944986 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 16 16:02:53.045625 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:53.045585 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 16 16:02:53.135171 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:53.135143 2581 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 16:02:53.135722 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:53.135269 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 16:02:53.146318 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:53.146286 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 16 16:02:53.222388 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:53.222310 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 16:02:53.236094 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:53.236070 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 16:02:53.246973 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:53.246944 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 16 16:02:53.268444 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:53.268415 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-4k8m7" Apr 16 16:02:53.276166 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:53.276143 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-4k8m7" Apr 16 16:02:53.320502 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:53.320458 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 15:57:52 +0000 UTC" deadline="2028-02-02 10:39:57.959139526 +0000 UTC" Apr 16 16:02:53.320502 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:53.320493 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15762h37m4.6386498s" Apr 16 16:02:53.342753 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:53.342711 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb09a532e924a099c0b64a77b795e41f.slice/crio-7e3301e57e06c2e3559bb84b6e262a1400cc5774f06863601570727ca3bc7bfb WatchSource:0}: Error finding container 
7e3301e57e06c2e3559bb84b6e262a1400cc5774f06863601570727ca3bc7bfb: Status 404 returned error can't find the container with id 7e3301e57e06c2e3559bb84b6e262a1400cc5774f06863601570727ca3bc7bfb Apr 16 16:02:53.342989 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:53.342969 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6668c770a04eab2dde37a1f32511d490.slice/crio-33135a412e576268af8b4ef56ba00e1c39b11e460e434b7dcbcb7f3f98196d50 WatchSource:0}: Error finding container 33135a412e576268af8b4ef56ba00e1c39b11e460e434b7dcbcb7f3f98196d50: Status 404 returned error can't find the container with id 33135a412e576268af8b4ef56ba00e1c39b11e460e434b7dcbcb7f3f98196d50 Apr 16 16:02:53.347091 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:53.347068 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 16 16:02:53.348266 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:53.348252 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:02:53.373846 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:53.373799 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" event={"ID":"6668c770a04eab2dde37a1f32511d490","Type":"ContainerStarted","Data":"33135a412e576268af8b4ef56ba00e1c39b11e460e434b7dcbcb7f3f98196d50"} Apr 16 16:02:53.374719 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:53.374699 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal" event={"ID":"bb09a532e924a099c0b64a77b795e41f","Type":"ContainerStarted","Data":"7e3301e57e06c2e3559bb84b6e262a1400cc5774f06863601570727ca3bc7bfb"} Apr 16 16:02:53.447175 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:53.447144 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 16 16:02:53.545689 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:53.545610 2581 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:02:53.547628 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:53.547611 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 16 16:02:53.648435 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:53.648406 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 16 16:02:53.650573 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:53.650551 2581 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:02:53.721902 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:53.721861 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal" Apr 16 16:02:53.736423 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:53.736398 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 16:02:53.737355 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:53.737330 2581 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" Apr 16 16:02:53.750278 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:53.750255 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 16:02:53.760489 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:53.760466 2581 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:02:54.029290 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.029258 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:02:54.203478 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.203400 2581 apiserver.go:52] "Watching apiserver" Apr 16 16:02:54.213051 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.212857 2581 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 16:02:54.213370 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.213323 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-cqtjh","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm","openshift-cluster-node-tuning-operator/tuned-662l5","openshift-dns/node-resolver-sjpbs","openshift-multus/multus-additional-cni-plugins-ml8fq","openshift-multus/multus-fph7l","openshift-network-diagnostics/network-check-target-m7vqg","openshift-network-operator/iptables-alerter-vq8l4","kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal","openshift-image-registry/node-ca-t5mlw","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal","openshift-multus/network-metrics-daemon-rgfkx","openshift-ovn-kubernetes/ovnkube-node-rdqjm"] Apr 16 16:02:54.215774 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.215747 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m7vqg" Apr 16 16:02:54.215902 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:54.215831 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m7vqg" podUID="313de001-22f6-48de-8e2b-ba59ee1494ec" Apr 16 16:02:54.217097 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.217072 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.218483 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.218405 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.221201 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.220768 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sjpbs" Apr 16 16:02:54.221201 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.220873 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.222156 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.222036 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.223301 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.223280 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-cqtjh" Apr 16 16:02:54.224822 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.224783 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vq8l4" Apr 16 16:02:54.226327 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.226309 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 16:02:54.227386 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.227329 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-t5mlw" Apr 16 16:02:54.228827 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.228706 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:02:54.228827 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:54.228780 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgfkx" podUID="0aa611e2-18d8-4712-9938-e8c21daeb1a0" Apr 16 16:02:54.230199 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.230179 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.233092 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233068 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5012ef67-2e62-4b06-a2f2-ef785998d3cb-sys-fs\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: \"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.233188 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233106 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-etc-kubernetes\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.233188 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233131 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-os-release\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.233188 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233156 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-host-var-lib-cni-multus\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.233188 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233182 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-tmp\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.233410 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233251 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2pj8\" (UniqueName: \"kubernetes.io/projected/70dfab46-81b5-47d9-b69d-a3d94a7c5e13-kube-api-access-g2pj8\") pod \"node-resolver-sjpbs\" (UID: \"70dfab46-81b5-47d9-b69d-a3d94a7c5e13\") " pod="openshift-dns/node-resolver-sjpbs" Apr 16 16:02:54.233410 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233307 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-host-run-k8s-cni-cncf-io\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.233410 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233353 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-modprobe-d\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.233410 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233392 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-host-var-lib-cni-bin\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.233598 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233423 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-host-run-multus-certs\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.233598 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233449 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2r9n\" (UniqueName: \"kubernetes.io/projected/40d4bde4-af0f-486a-90e6-101c53fe3e24-kube-api-access-v2r9n\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.233598 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233476 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-var-lib-kubelet\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.233598 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233527 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-tuned\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.233598 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233551 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-sysctl-d\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.233598 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233575 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fe31cde4-f24b-44d8-9e19-ba426c58b544-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.233598 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233598 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3143666f-7f83-4d6e-ae14-75fdaf4f8e7c-host-slash\") pod \"iptables-alerter-vq8l4\" (UID: \"3143666f-7f83-4d6e-ae14-75fdaf4f8e7c\") " pod="openshift-network-operator/iptables-alerter-vq8l4" Apr 16 16:02:54.233920 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233620 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-lib-modules\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.233920 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233648 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fe31cde4-f24b-44d8-9e19-ba426c58b544-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.233920 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233676 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ef56dfc1-1254-428e-ba95-899e4b0e0908-konnectivity-ca\") pod \"konnectivity-agent-cqtjh\" (UID: \"ef56dfc1-1254-428e-ba95-899e4b0e0908\") " pod="kube-system/konnectivity-agent-cqtjh" Apr 16 16:02:54.233920 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233699 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-hostroot\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.233920 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233760 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5012ef67-2e62-4b06-a2f2-ef785998d3cb-device-dir\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: \"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.233920 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233789 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5012ef67-2e62-4b06-a2f2-ef785998d3cb-etc-selinux\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: \"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.233920 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233811 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-run\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.233920 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233833 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-host\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.233920 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233865 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-499rb\" (UniqueName: \"kubernetes.io/projected/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-kube-api-access-499rb\") pod \"tuned-662l5\" (UID: 
\"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.233920 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233890 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fe31cde4-f24b-44d8-9e19-ba426c58b544-cni-binary-copy\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.233920 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233914 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxm84\" (UniqueName: \"kubernetes.io/projected/5012ef67-2e62-4b06-a2f2-ef785998d3cb-kube-api-access-dxm84\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: \"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.234468 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233939 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-multus-socket-dir-parent\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.234468 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233961 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-sysconfig\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.234468 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.233983 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-sysctl-conf\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.234468 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234015 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-sys\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.234468 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234048 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-host-var-lib-kubelet\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.234468 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234072 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fe31cde4-f24b-44d8-9e19-ba426c58b544-os-release\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 
16:02:54.234468 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234096 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5012ef67-2e62-4b06-a2f2-ef785998d3cb-registration-dir\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: \"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.234468 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234187 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-host-run-netns\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.234468 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234212 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ef56dfc1-1254-428e-ba95-899e4b0e0908-agent-certs\") pod \"konnectivity-agent-cqtjh\" (UID: \"ef56dfc1-1254-428e-ba95-899e4b0e0908\") " pod="kube-system/konnectivity-agent-cqtjh" Apr 16 16:02:54.234468 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234248 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5012ef67-2e62-4b06-a2f2-ef785998d3cb-socket-dir\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: \"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.234468 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234293 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-multus-conf-dir\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.234468 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234324 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fe31cde4-f24b-44d8-9e19-ba426c58b544-cnibin\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.234468 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234385 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fe31cde4-f24b-44d8-9e19-ba426c58b544-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.234468 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234439 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2bs9\" (UniqueName: \"kubernetes.io/projected/3143666f-7f83-4d6e-ae14-75fdaf4f8e7c-kube-api-access-x2bs9\") pod \"iptables-alerter-vq8l4\" (UID: \"3143666f-7f83-4d6e-ae14-75fdaf4f8e7c\") " pod="openshift-network-operator/iptables-alerter-vq8l4" Apr 16 16:02:54.234468 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234472 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ktgq\" (UniqueName: \"kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq\") pod \"network-check-target-m7vqg\" (UID: \"313de001-22f6-48de-8e2b-ba59ee1494ec\") " pod="openshift-network-diagnostics/network-check-target-m7vqg" Apr 16 16:02:54.235180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234491 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5012ef67-2e62-4b06-a2f2-ef785998d3cb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: \"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.235180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234510 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/70dfab46-81b5-47d9-b69d-a3d94a7c5e13-tmp-dir\") pod \"node-resolver-sjpbs\" (UID: \"70dfab46-81b5-47d9-b69d-a3d94a7c5e13\") " pod="openshift-dns/node-resolver-sjpbs" Apr 16 16:02:54.235180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234524 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-multus-cni-dir\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.235180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234539 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-kubernetes\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.235180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234556 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3143666f-7f83-4d6e-ae14-75fdaf4f8e7c-iptables-alerter-script\") pod \"iptables-alerter-vq8l4\" (UID: \"3143666f-7f83-4d6e-ae14-75fdaf4f8e7c\") " pod="openshift-network-operator/iptables-alerter-vq8l4" Apr 16 16:02:54.235180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234597 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/40d4bde4-af0f-486a-90e6-101c53fe3e24-cni-binary-copy\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.235180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234615 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t86c\" (UniqueName: \"kubernetes.io/projected/fe31cde4-f24b-44d8-9e19-ba426c58b544-kube-api-access-2t86c\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.235180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234631 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/70dfab46-81b5-47d9-b69d-a3d94a7c5e13-hosts-file\") pod \"node-resolver-sjpbs\" (UID: \"70dfab46-81b5-47d9-b69d-a3d94a7c5e13\") " pod="openshift-dns/node-resolver-sjpbs" Apr 16 16:02:54.235180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234646 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-system-cni-dir\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.235180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234659 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-cnibin\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.235180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234672 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/40d4bde4-af0f-486a-90e6-101c53fe3e24-multus-daemon-config\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.235180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234685 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-systemd\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.235180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.234709 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe31cde4-f24b-44d8-9e19-ba426c58b544-system-cni-dir\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.236148 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.236117 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:02:54.236240 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.236120 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 16:02:54.236240 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.236231 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 16:02:54.239209 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.237306 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 16:02:54.239209 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.237654 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-crzgr\"" Apr 16 16:02:54.239209 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.237691 2581 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 16:02:54.239209 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.238125 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:02:54.239209 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.238227 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 16:02:54.239209 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.238252 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 16:02:54.239209 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.238383 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-bbkpr\"" Apr 16 16:02:54.239209 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.238504 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 16:02:54.239209 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.238600 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 16:02:54.239209 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.238696 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 16:02:54.239209 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.238758 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 16:02:54.239209 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.238829 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 16:02:54.239209 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.238852 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 16:02:54.239209 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.238985 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 16:02:54.239209 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.239010 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-bfcxf\"" Apr 16 16:02:54.239209 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.239020 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 16:02:54.240134 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.239265 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-8nt5q\"" Apr 16 16:02:54.240134 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.239311 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 16:02:54.240134 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.239442 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 16:02:54.240134 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.239558 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 16:02:54.240134 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.239585 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-kkxt5\"" Apr 16 16:02:54.241136 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.240664 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 16:02:54.241926 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.241906 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-rrbwl\"" Apr 16 16:02:54.242527 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.242504 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-5h28f\"" Apr 16 16:02:54.242688 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.242666 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dpq8v\"" Apr 16 16:02:54.242751 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.242723 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 16:02:54.243011 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.242993 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 16:02:54.243091 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.243016 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 16:02:54.243180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.243165 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 16:02:54.243246 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.243229 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 16:02:54.243651 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.243632 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7lvrj\"" Apr 16 16:02:54.243651 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.243642 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 16:02:54.277256 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.277220 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 15:57:53 +0000 UTC" deadline="2027-09-22 06:16:27.251988383 +0000 UTC" Apr 16 16:02:54.277256 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.277252 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12566h13m32.974739478s" Apr 16 16:02:54.324194 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.324161 2581 desired_state_of_world_populator.go:158] "Finished populating initial desired state of 
world" Apr 16 16:02:54.335138 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335100 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-run\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.335138 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335143 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-host\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.335375 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335170 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-499rb\" (UniqueName: \"kubernetes.io/projected/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-kube-api-access-499rb\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.335375 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335244 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-host\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.335375 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335244 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-run\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.335375 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335272 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fe31cde4-f24b-44d8-9e19-ba426c58b544-cni-binary-copy\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.335375 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335305 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-kubelet\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.335375 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335321 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-log-socket\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.335375 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335359 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.335775 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335389 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs\") pod \"network-metrics-daemon-rgfkx\" (UID: \"0aa611e2-18d8-4712-9938-e8c21daeb1a0\") " pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:02:54.335775 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335420 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxm84\" (UniqueName: \"kubernetes.io/projected/5012ef67-2e62-4b06-a2f2-ef785998d3cb-kube-api-access-dxm84\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: \"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.335775 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335456 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-multus-socket-dir-parent\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.335775 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335485 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-sysconfig\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.335775 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335510 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-sysctl-conf\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.335775 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335532 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-sys\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.335775 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335549 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-multus-socket-dir-parent\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.335775 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335557 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-systemd-units\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.335775 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335593 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-run-netns\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.335775 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335589 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-sysconfig\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.335775 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335618 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-sys\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.335775 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335626 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-cni-bin\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.335775 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335657 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-host-var-lib-kubelet\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.335775 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335690 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fe31cde4-f24b-44d8-9e19-ba426c58b544-os-release\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.335775 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335719 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-run-openvswitch\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.335775 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335727 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-host-var-lib-kubelet\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.335775 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335724 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-sysctl-conf\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.336510 
ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335745 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5012ef67-2e62-4b06-a2f2-ef785998d3cb-registration-dir\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: \"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.336510 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335771 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-host-run-netns\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.336510 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335791 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fe31cde4-f24b-44d8-9e19-ba426c58b544-os-release\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.336510 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335796 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ef56dfc1-1254-428e-ba95-899e4b0e0908-agent-certs\") pod \"konnectivity-agent-cqtjh\" (UID: \"ef56dfc1-1254-428e-ba95-899e4b0e0908\") " pod="kube-system/konnectivity-agent-cqtjh" Apr 16 16:02:54.336510 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335853 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-host-run-netns\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.336510 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335859 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5012ef67-2e62-4b06-a2f2-ef785998d3cb-registration-dir\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: \"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.336510 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335877 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c718a57-bd23-432b-bf19-493fd2ad600a-ovnkube-script-lib\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.336510 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335906 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5012ef67-2e62-4b06-a2f2-ef785998d3cb-socket-dir\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: \"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.336510 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335932 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-multus-conf-dir\") pod \"multus-fph7l\" 
(UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.336510 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335946 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fe31cde4-f24b-44d8-9e19-ba426c58b544-cni-binary-copy\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.336510 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335959 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fe31cde4-f24b-44d8-9e19-ba426c58b544-cnibin\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.336510 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335985 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fe31cde4-f24b-44d8-9e19-ba426c58b544-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.336510 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.335989 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-multus-conf-dir\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.336510 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336014 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fe31cde4-f24b-44d8-9e19-ba426c58b544-cnibin\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.336510 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336027 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2bs9\" (UniqueName: \"kubernetes.io/projected/3143666f-7f83-4d6e-ae14-75fdaf4f8e7c-kube-api-access-x2bs9\") pod \"iptables-alerter-vq8l4\" (UID: \"3143666f-7f83-4d6e-ae14-75fdaf4f8e7c\") " pod="openshift-network-operator/iptables-alerter-vq8l4" Apr 16 16:02:54.336510 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336042 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5012ef67-2e62-4b06-a2f2-ef785998d3cb-socket-dir\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: \"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.336510 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336060 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-var-lib-openvswitch\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.337354 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336161 2581 swap_util.go:74] "error creating dir to test if tmpfs 
noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 16:02:54.337354 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336176 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-cni-netd\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.337354 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336257 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtjn7\" (UniqueName: \"kubernetes.io/projected/2c718a57-bd23-432b-bf19-493fd2ad600a-kube-api-access-xtjn7\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.337354 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336289 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ktgq\" (UniqueName: \"kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq\") pod \"network-check-target-m7vqg\" (UID: \"313de001-22f6-48de-8e2b-ba59ee1494ec\") " pod="openshift-network-diagnostics/network-check-target-m7vqg" Apr 16 16:02:54.337354 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336315 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5012ef67-2e62-4b06-a2f2-ef785998d3cb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: \"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.337354 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336356 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/70dfab46-81b5-47d9-b69d-a3d94a7c5e13-tmp-dir\") pod \"node-resolver-sjpbs\" (UID: \"70dfab46-81b5-47d9-b69d-a3d94a7c5e13\") " pod="openshift-dns/node-resolver-sjpbs" Apr 16 16:02:54.337354 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336381 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-multus-cni-dir\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.337354 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336405 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-kubernetes\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.337354 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336429 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3143666f-7f83-4d6e-ae14-75fdaf4f8e7c-iptables-alerter-script\") pod \"iptables-alerter-vq8l4\" (UID: \"3143666f-7f83-4d6e-ae14-75fdaf4f8e7c\") " pod="openshift-network-operator/iptables-alerter-vq8l4" Apr 16 16:02:54.337354 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336428 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5012ef67-2e62-4b06-a2f2-ef785998d3cb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: \"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.337354 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336518 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-multus-cni-dir\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.337354 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336456 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c718a57-bd23-432b-bf19-493fd2ad600a-env-overrides\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.337354 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336540 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-kubernetes\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.337354 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336567 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/40d4bde4-af0f-486a-90e6-101c53fe3e24-cni-binary-copy\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.337354 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336596 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2t86c\" (UniqueName: \"kubernetes.io/projected/fe31cde4-f24b-44d8-9e19-ba426c58b544-kube-api-access-2t86c\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.337354 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336622 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/70dfab46-81b5-47d9-b69d-a3d94a7c5e13-hosts-file\") pod \"node-resolver-sjpbs\" (UID: \"70dfab46-81b5-47d9-b69d-a3d94a7c5e13\") " pod="openshift-dns/node-resolver-sjpbs" Apr 16 16:02:54.337354 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336643 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-system-cni-dir\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.337354 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336693 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-system-cni-dir\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.338180 ip-10-0-135-144 kubenswrapper[2581]: 
I0416 16:02:54.336734 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/70dfab46-81b5-47d9-b69d-a3d94a7c5e13-hosts-file\") pod \"node-resolver-sjpbs\" (UID: \"70dfab46-81b5-47d9-b69d-a3d94a7c5e13\") " pod="openshift-dns/node-resolver-sjpbs" Apr 16 16:02:54.338180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336745 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/70dfab46-81b5-47d9-b69d-a3d94a7c5e13-tmp-dir\") pod \"node-resolver-sjpbs\" (UID: \"70dfab46-81b5-47d9-b69d-a3d94a7c5e13\") " pod="openshift-dns/node-resolver-sjpbs" Apr 16 16:02:54.338180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336827 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-cnibin\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.338180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336855 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/40d4bde4-af0f-486a-90e6-101c53fe3e24-multus-daemon-config\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.338180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336872 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fe31cde4-f24b-44d8-9e19-ba426c58b544-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.338180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336880 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-systemd\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.338180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336905 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe31cde4-f24b-44d8-9e19-ba426c58b544-system-cni-dir\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.338180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336901 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-cnibin\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.338180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336933 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5012ef67-2e62-4b06-a2f2-ef785998d3cb-sys-fs\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: \"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.338180 ip-10-0-135-144 kubenswrapper[2581]: I0416 
16:02:54.336958 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-etc-kubernetes\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.338180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336963 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-systemd\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.338180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.336985 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-run-ovn\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.338180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337009 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-os-release\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.338180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337037 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-host-var-lib-cni-multus\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.338180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337050 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3143666f-7f83-4d6e-ae14-75fdaf4f8e7c-iptables-alerter-script\") pod \"iptables-alerter-vq8l4\" (UID: \"3143666f-7f83-4d6e-ae14-75fdaf4f8e7c\") " pod="openshift-network-operator/iptables-alerter-vq8l4" Apr 16 16:02:54.338180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337063 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-tmp\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.338180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337075 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe31cde4-f24b-44d8-9e19-ba426c58b544-system-cni-dir\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.338180 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337123 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/40d4bde4-af0f-486a-90e6-101c53fe3e24-cni-binary-copy\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.339036 ip-10-0-135-144 kubenswrapper[2581]: 
I0416 16:02:54.337137 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-etc-kubernetes\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.339036 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337123 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d19613ed-0faf-481f-bc0d-b4f8fcf0f259-serviceca\") pod \"node-ca-t5mlw\" (UID: \"d19613ed-0faf-481f-bc0d-b4f8fcf0f259\") " pod="openshift-image-registry/node-ca-t5mlw" Apr 16 16:02:54.339036 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337025 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5012ef67-2e62-4b06-a2f2-ef785998d3cb-sys-fs\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: \"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.339036 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337174 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-host-var-lib-cni-multus\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.339036 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337176 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2pj8\" (UniqueName: \"kubernetes.io/projected/70dfab46-81b5-47d9-b69d-a3d94a7c5e13-kube-api-access-g2pj8\") pod \"node-resolver-sjpbs\" (UID: \"70dfab46-81b5-47d9-b69d-a3d94a7c5e13\") " pod="openshift-dns/node-resolver-sjpbs" Apr 16 16:02:54.339036 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337205 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-host-run-k8s-cni-cncf-io\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.339036 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337255 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-host-run-k8s-cni-cncf-io\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.339036 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337265 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-os-release\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.339036 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337284 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-modprobe-d\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.339036 ip-10-0-135-144 kubenswrapper[2581]: 
I0416 16:02:54.337310 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-run-systemd\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.339036 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337330 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-node-log\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.339036 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337369 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-run-ovn-kubernetes\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.339036 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337385 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-modprobe-d\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.339036 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337397 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4bm2\" (UniqueName: \"kubernetes.io/projected/0aa611e2-18d8-4712-9938-e8c21daeb1a0-kube-api-access-r4bm2\") pod \"network-metrics-daemon-rgfkx\" (UID: \"0aa611e2-18d8-4712-9938-e8c21daeb1a0\") " pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:02:54.339036 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337368 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/40d4bde4-af0f-486a-90e6-101c53fe3e24-multus-daemon-config\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.339036 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337425 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-host-var-lib-cni-bin\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.339036 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337451 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-host-run-multus-certs\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.339798 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337475 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2r9n\" (UniqueName: \"kubernetes.io/projected/40d4bde4-af0f-486a-90e6-101c53fe3e24-kube-api-access-v2r9n\") pod \"multus-fph7l\" (UID: 
\"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.339798 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337499 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-var-lib-kubelet\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.339798 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337505 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-host-var-lib-cni-bin\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.339798 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337520 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-tuned\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.339798 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337521 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-host-run-multus-certs\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.339798 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337545 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-slash\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.339798 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337585 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-var-lib-kubelet\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.339798 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337627 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c718a57-bd23-432b-bf19-493fd2ad600a-ovn-node-metrics-cert\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.339798 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337674 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-sysctl-d\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.339798 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337700 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/fe31cde4-f24b-44d8-9e19-ba426c58b544-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.339798 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337768 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3143666f-7f83-4d6e-ae14-75fdaf4f8e7c-host-slash\") pod \"iptables-alerter-vq8l4\" (UID: \"3143666f-7f83-4d6e-ae14-75fdaf4f8e7c\") " pod="openshift-network-operator/iptables-alerter-vq8l4" Apr 16 16:02:54.339798 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337795 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-sysctl-d\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.339798 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337810 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-lib-modules\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.339798 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337837 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fe31cde4-f24b-44d8-9e19-ba426c58b544-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.339798 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337844 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3143666f-7f83-4d6e-ae14-75fdaf4f8e7c-host-slash\") pod \"iptables-alerter-vq8l4\" (UID: \"3143666f-7f83-4d6e-ae14-75fdaf4f8e7c\") " pod="openshift-network-operator/iptables-alerter-vq8l4" Apr 16 16:02:54.339798 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337863 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ef56dfc1-1254-428e-ba95-899e4b0e0908-konnectivity-ca\") pod \"konnectivity-agent-cqtjh\" (UID: \"ef56dfc1-1254-428e-ba95-899e4b0e0908\") " pod="kube-system/konnectivity-agent-cqtjh" Apr 16 16:02:54.339798 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337890 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d19613ed-0faf-481f-bc0d-b4f8fcf0f259-host\") pod \"node-ca-t5mlw\" (UID: \"d19613ed-0faf-481f-bc0d-b4f8fcf0f259\") " pod="openshift-image-registry/node-ca-t5mlw" Apr 16 16:02:54.340293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337917 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfctx\" (UniqueName: \"kubernetes.io/projected/d19613ed-0faf-481f-bc0d-b4f8fcf0f259-kube-api-access-nfctx\") pod \"node-ca-t5mlw\" (UID: \"d19613ed-0faf-481f-bc0d-b4f8fcf0f259\") " pod="openshift-image-registry/node-ca-t5mlw" Apr 16 16:02:54.340293 ip-10-0-135-144 
kubenswrapper[2581]: I0416 16:02:54.337933 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-lib-modules\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.340293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337942 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-hostroot\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.340293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337969 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5012ef67-2e62-4b06-a2f2-ef785998d3cb-device-dir\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: \"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.340293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.337992 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-etc-openvswitch\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.340293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.338020 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c718a57-bd23-432b-bf19-493fd2ad600a-ovnkube-config\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.340293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.338049 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5012ef67-2e62-4b06-a2f2-ef785998d3cb-etc-selinux\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: \"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.340293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.338050 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/40d4bde4-af0f-486a-90e6-101c53fe3e24-hostroot\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.340293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.338132 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fe31cde4-f24b-44d8-9e19-ba426c58b544-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.340293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.338194 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5012ef67-2e62-4b06-a2f2-ef785998d3cb-etc-selinux\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: 
\"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.340293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.338204 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5012ef67-2e62-4b06-a2f2-ef785998d3cb-device-dir\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: \"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.340293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.338405 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fe31cde4-f24b-44d8-9e19-ba426c58b544-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.340293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.338552 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ef56dfc1-1254-428e-ba95-899e4b0e0908-konnectivity-ca\") pod \"konnectivity-agent-cqtjh\" (UID: \"ef56dfc1-1254-428e-ba95-899e4b0e0908\") " pod="kube-system/konnectivity-agent-cqtjh" Apr 16 16:02:54.340293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.339746 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-tmp\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.340293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.339924 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ef56dfc1-1254-428e-ba95-899e4b0e0908-agent-certs\") pod \"konnectivity-agent-cqtjh\" (UID: \"ef56dfc1-1254-428e-ba95-899e4b0e0908\") " pod="kube-system/konnectivity-agent-cqtjh" Apr 16 16:02:54.340293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.340211 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-etc-tuned\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.359688 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:54.359650 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:02:54.359688 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:54.359675 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:02:54.359688 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:54.359689 2581 projected.go:194] Error preparing data for projected volume kube-api-access-6ktgq for pod openshift-network-diagnostics/network-check-target-m7vqg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:02:54.359918 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:54.359780 2581 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq podName:313de001-22f6-48de-8e2b-ba59ee1494ec nodeName:}" failed. No retries permitted until 2026-04-16 16:02:54.859733257 +0000 UTC m=+3.036794307 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6ktgq" (UniqueName: "kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq") pod "network-check-target-m7vqg" (UID: "313de001-22f6-48de-8e2b-ba59ee1494ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:02:54.360004 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.359983 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2r9n\" (UniqueName: \"kubernetes.io/projected/40d4bde4-af0f-486a-90e6-101c53fe3e24-kube-api-access-v2r9n\") pod \"multus-fph7l\" (UID: \"40d4bde4-af0f-486a-90e6-101c53fe3e24\") " pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.360096 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.360076 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t86c\" (UniqueName: \"kubernetes.io/projected/fe31cde4-f24b-44d8-9e19-ba426c58b544-kube-api-access-2t86c\") pod \"multus-additional-cni-plugins-ml8fq\" (UID: \"fe31cde4-f24b-44d8-9e19-ba426c58b544\") " pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.360586 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.360558 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2pj8\" (UniqueName: \"kubernetes.io/projected/70dfab46-81b5-47d9-b69d-a3d94a7c5e13-kube-api-access-g2pj8\") pod \"node-resolver-sjpbs\" (UID: \"70dfab46-81b5-47d9-b69d-a3d94a7c5e13\") " pod="openshift-dns/node-resolver-sjpbs" Apr 16 16:02:54.362352 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.362320 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2bs9\" (UniqueName: \"kubernetes.io/projected/3143666f-7f83-4d6e-ae14-75fdaf4f8e7c-kube-api-access-x2bs9\") pod \"iptables-alerter-vq8l4\" (UID: \"3143666f-7f83-4d6e-ae14-75fdaf4f8e7c\") " pod="openshift-network-operator/iptables-alerter-vq8l4" Apr 16 16:02:54.363594 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.363570 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxm84\" (UniqueName: \"kubernetes.io/projected/5012ef67-2e62-4b06-a2f2-ef785998d3cb-kube-api-access-dxm84\") pod \"aws-ebs-csi-driver-node-c6kvm\" (UID: \"5012ef67-2e62-4b06-a2f2-ef785998d3cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.365518 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.365492 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-499rb\" (UniqueName: \"kubernetes.io/projected/2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee-kube-api-access-499rb\") pod \"tuned-662l5\" (UID: \"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee\") " pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.438692 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.438640 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-var-lib-openvswitch\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.438692 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.438692 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-cni-netd\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.438935 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.438715 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xtjn7\" (UniqueName: \"kubernetes.io/projected/2c718a57-bd23-432b-bf19-493fd2ad600a-kube-api-access-xtjn7\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.438935 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.438753 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c718a57-bd23-432b-bf19-493fd2ad600a-env-overrides\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.438935 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.438782 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-run-ovn\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.438935 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.438783 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-cni-netd\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.438935 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.438806 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d19613ed-0faf-481f-bc0d-b4f8fcf0f259-serviceca\") pod \"node-ca-t5mlw\" (UID: \"d19613ed-0faf-481f-bc0d-b4f8fcf0f259\") " pod="openshift-image-registry/node-ca-t5mlw" Apr 16 16:02:54.438935 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.438834 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-var-lib-openvswitch\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.438935 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.438855 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-run-systemd\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.438935 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.438883 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-node-log\") pod \"ovnkube-node-rdqjm\" (UID: 
\"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.439293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.438940 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-run-ovn-kubernetes\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.439293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.438968 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4bm2\" (UniqueName: \"kubernetes.io/projected/0aa611e2-18d8-4712-9938-e8c21daeb1a0-kube-api-access-r4bm2\") pod \"network-metrics-daemon-rgfkx\" (UID: \"0aa611e2-18d8-4712-9938-e8c21daeb1a0\") " pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:02:54.439293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.438998 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-slash\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.439293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439025 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c718a57-bd23-432b-bf19-493fd2ad600a-ovn-node-metrics-cert\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.439293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439047 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-run-ovn\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.439293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439054 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d19613ed-0faf-481f-bc0d-b4f8fcf0f259-host\") pod \"node-ca-t5mlw\" (UID: \"d19613ed-0faf-481f-bc0d-b4f8fcf0f259\") " pod="openshift-image-registry/node-ca-t5mlw" Apr 16 16:02:54.439293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439094 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d19613ed-0faf-481f-bc0d-b4f8fcf0f259-host\") pod \"node-ca-t5mlw\" (UID: \"d19613ed-0faf-481f-bc0d-b4f8fcf0f259\") " pod="openshift-image-registry/node-ca-t5mlw" Apr 16 16:02:54.439293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439098 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nfctx\" (UniqueName: \"kubernetes.io/projected/d19613ed-0faf-481f-bc0d-b4f8fcf0f259-kube-api-access-nfctx\") pod \"node-ca-t5mlw\" (UID: \"d19613ed-0faf-481f-bc0d-b4f8fcf0f259\") " pod="openshift-image-registry/node-ca-t5mlw" Apr 16 16:02:54.439293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439129 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-etc-openvswitch\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.439293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439134 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-run-systemd\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.439293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439155 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c718a57-bd23-432b-bf19-493fd2ad600a-ovnkube-config\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.439293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439169 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-node-log\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.439293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439185 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-kubelet\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.439293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439207 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-run-ovn-kubernetes\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.439293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439210 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-log-socket\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.439293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439241 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-log-socket\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.439293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439279 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.440741 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439302 2581 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d19613ed-0faf-481f-bc0d-b4f8fcf0f259-serviceca\") pod \"node-ca-t5mlw\" (UID: \"d19613ed-0faf-481f-bc0d-b4f8fcf0f259\") " pod="openshift-image-registry/node-ca-t5mlw" Apr 16 16:02:54.440741 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439373 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c718a57-bd23-432b-bf19-493fd2ad600a-env-overrides\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.440741 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439415 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-etc-openvswitch\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.440741 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439431 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-kubelet\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.440741 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439241 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.440741 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439470 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs\") pod \"network-metrics-daemon-rgfkx\" (UID: \"0aa611e2-18d8-4712-9938-e8c21daeb1a0\") " pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:02:54.440741 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439499 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-systemd-units\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.440741 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439524 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-run-netns\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.440741 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439549 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-cni-bin\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.440741 ip-10-0-135-144 
kubenswrapper[2581]: I0416 16:02:54.439562 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-systemd-units\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.440741 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439576 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-run-openvswitch\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.440741 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439607 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c718a57-bd23-432b-bf19-493fd2ad600a-ovnkube-script-lib\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.440741 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:54.439631 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:02:54.440741 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439525 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-slash\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.440741 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439672 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-run-openvswitch\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.440741 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:54.439697 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs podName:0aa611e2-18d8-4712-9938-e8c21daeb1a0 nodeName:}" failed. No retries permitted until 2026-04-16 16:02:54.93968114 +0000 UTC m=+3.116742173 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs") pod "network-metrics-daemon-rgfkx" (UID: "0aa611e2-18d8-4712-9938-e8c21daeb1a0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:02:54.440741 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439691 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-run-netns\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.441373 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439754 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c718a57-bd23-432b-bf19-493fd2ad600a-host-cni-bin\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.441373 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.439843 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c718a57-bd23-432b-bf19-493fd2ad600a-ovnkube-config\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.441373 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.440135 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c718a57-bd23-432b-bf19-493fd2ad600a-ovnkube-script-lib\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.442008 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.441985 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c718a57-bd23-432b-bf19-493fd2ad600a-ovn-node-metrics-cert\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.448479 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.448449 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtjn7\" (UniqueName: \"kubernetes.io/projected/2c718a57-bd23-432b-bf19-493fd2ad600a-kube-api-access-xtjn7\") pod \"ovnkube-node-rdqjm\" (UID: \"2c718a57-bd23-432b-bf19-493fd2ad600a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.449177 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.449154 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfctx\" (UniqueName: \"kubernetes.io/projected/d19613ed-0faf-481f-bc0d-b4f8fcf0f259-kube-api-access-nfctx\") pod \"node-ca-t5mlw\" (UID: \"d19613ed-0faf-481f-bc0d-b4f8fcf0f259\") " pod="openshift-image-registry/node-ca-t5mlw" Apr 16 16:02:54.449281 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.449256 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4bm2\" (UniqueName: \"kubernetes.io/projected/0aa611e2-18d8-4712-9938-e8c21daeb1a0-kube-api-access-r4bm2\") pod \"network-metrics-daemon-rgfkx\" (UID: \"0aa611e2-18d8-4712-9938-e8c21daeb1a0\") " pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:02:54.530621 ip-10-0-135-144 
kubenswrapper[2581]: I0416 16:02:54.530527 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" Apr 16 16:02:54.545761 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.545725 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-662l5" Apr 16 16:02:54.555509 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.555482 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sjpbs" Apr 16 16:02:54.561088 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.561067 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ml8fq" Apr 16 16:02:54.568512 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.568495 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fph7l" Apr 16 16:02:54.575017 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.575002 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-cqtjh" Apr 16 16:02:54.584580 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.584558 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vq8l4" Apr 16 16:02:54.592130 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.592109 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-t5mlw" Apr 16 16:02:54.597794 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.597775 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:02:54.943842 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.943757 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs\") pod \"network-metrics-daemon-rgfkx\" (UID: \"0aa611e2-18d8-4712-9938-e8c21daeb1a0\") " pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:02:54.943842 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:54.943806 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ktgq\" (UniqueName: \"kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq\") pod \"network-check-target-m7vqg\" (UID: \"313de001-22f6-48de-8e2b-ba59ee1494ec\") " pod="openshift-network-diagnostics/network-check-target-m7vqg" Apr 16 16:02:54.944060 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:54.943922 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:02:54.944060 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:54.943938 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:02:54.944060 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:54.943952 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:02:54.944060 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:54.943962 2581 projected.go:194] Error preparing data for 
projected volume kube-api-access-6ktgq for pod openshift-network-diagnostics/network-check-target-m7vqg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:02:54.944060 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:54.943996 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs podName:0aa611e2-18d8-4712-9938-e8c21daeb1a0 nodeName:}" failed. No retries permitted until 2026-04-16 16:02:55.943978232 +0000 UTC m=+4.121039266 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs") pod "network-metrics-daemon-rgfkx" (UID: "0aa611e2-18d8-4712-9938-e8c21daeb1a0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:02:54.944060 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:54.944019 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq podName:313de001-22f6-48de-8e2b-ba59ee1494ec nodeName:}" failed. No retries permitted until 2026-04-16 16:02:55.944004184 +0000 UTC m=+4.121065222 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-6ktgq" (UniqueName: "kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq") pod "network-check-target-m7vqg" (UID: "313de001-22f6-48de-8e2b-ba59ee1494ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:02:55.102023 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:55.101994 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5012ef67_2e62_4b06_a2f2_ef785998d3cb.slice/crio-82c552265a7112ed13e167d82aff3854d47586bde3f884e05068d1783897b4c6 WatchSource:0}: Error finding container 82c552265a7112ed13e167d82aff3854d47586bde3f884e05068d1783897b4c6: Status 404 returned error can't find the container with id 82c552265a7112ed13e167d82aff3854d47586bde3f884e05068d1783897b4c6 Apr 16 16:02:55.103124 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:55.103092 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c3bafcf_9b23_44de_8e7b_05e8fb94b9ee.slice/crio-085fc28f419b52cea1924d350850b30d3e51381aee7119e1feae572487f23373 WatchSource:0}: Error finding container 085fc28f419b52cea1924d350850b30d3e51381aee7119e1feae572487f23373: Status 404 returned error can't find the container with id 085fc28f419b52cea1924d350850b30d3e51381aee7119e1feae572487f23373 Apr 16 16:02:55.106222 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:55.106169 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c718a57_bd23_432b_bf19_493fd2ad600a.slice/crio-7cebcd33cabf92d00b3d708d6dd188b0682d561850db06b03523b03a3fc79379 WatchSource:0}: Error finding container 7cebcd33cabf92d00b3d708d6dd188b0682d561850db06b03523b03a3fc79379: Status 404 returned error can't find the container with id 7cebcd33cabf92d00b3d708d6dd188b0682d561850db06b03523b03a3fc79379 Apr 16 16:02:55.108225 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:55.108197 2581 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70dfab46_81b5_47d9_b69d_a3d94a7c5e13.slice/crio-5d786caedbe1e7a8a4b6c06c551e064e811577d790ed1bd1894aec1106c19840 WatchSource:0}: Error finding container 5d786caedbe1e7a8a4b6c06c551e064e811577d790ed1bd1894aec1106c19840: Status 404 returned error can't find the container with id 5d786caedbe1e7a8a4b6c06c551e064e811577d790ed1bd1894aec1106c19840 Apr 16 16:02:55.109976 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:02:55.109859 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40d4bde4_af0f_486a_90e6_101c53fe3e24.slice/crio-8bedb21ff0139571c7082d9b67b324e00a5dd8ba25b670fa351f2a9e4417b2b9 WatchSource:0}: Error finding container 8bedb21ff0139571c7082d9b67b324e00a5dd8ba25b670fa351f2a9e4417b2b9: Status 404 returned error can't find the container with id 8bedb21ff0139571c7082d9b67b324e00a5dd8ba25b670fa351f2a9e4417b2b9 Apr 16 16:02:55.278484 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:55.278220 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 15:57:53 +0000 UTC" deadline="2027-12-09 02:24:31.953828141 +0000 UTC" Apr 16 16:02:55.278484 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:55.278416 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14434h21m36.675415639s" Apr 16 16:02:55.378137 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:55.378097 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-t5mlw" event={"ID":"d19613ed-0faf-481f-bc0d-b4f8fcf0f259","Type":"ContainerStarted","Data":"cc7bf30ea37c468f9338c26744ed34446a531c4c57b6a03667a2c6f625c0ea3e"} Apr 16 16:02:55.381565 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:55.381525 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ml8fq" event={"ID":"fe31cde4-f24b-44d8-9e19-ba426c58b544","Type":"ContainerStarted","Data":"a38cab524d4fea7bc0273dec35e982e0c66a5dc6b7d3bd6d507354c8b0916b1d"} Apr 16 16:02:55.382637 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:55.382611 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fph7l" event={"ID":"40d4bde4-af0f-486a-90e6-101c53fe3e24","Type":"ContainerStarted","Data":"8bedb21ff0139571c7082d9b67b324e00a5dd8ba25b670fa351f2a9e4417b2b9"} Apr 16 16:02:55.383533 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:55.383506 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" event={"ID":"2c718a57-bd23-432b-bf19-493fd2ad600a","Type":"ContainerStarted","Data":"7cebcd33cabf92d00b3d708d6dd188b0682d561850db06b03523b03a3fc79379"} Apr 16 16:02:55.384493 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:55.384472 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-662l5" event={"ID":"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee","Type":"ContainerStarted","Data":"085fc28f419b52cea1924d350850b30d3e51381aee7119e1feae572487f23373"} Apr 16 16:02:55.385832 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:55.385808 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal" event={"ID":"bb09a532e924a099c0b64a77b795e41f","Type":"ContainerStarted","Data":"0dc230b7976c88c94af5054c87b5761a2969281b02d716b42e21d2b282f971de"} 
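
The durationBeforeRetry values in the nestedpendingoperations entries above double on each consecutive failure: 500ms at first, then 1s, and 2s, 4s and 8s further down. The kubelet is backing off exponentially on MountVolume.SetUp for volumes whose backing Secret or ConfigMap objects are not yet registered with it. Below is a minimal Go sketch of that retry shape; it is illustrative only, not the kubelet's actual exponential-backoff code. The 500ms initial value matches the log, while the cap is an assumption based on upstream defaults.

package main

import (
	"fmt"
	"time"
)

// Assumed values: 500ms matches the first durationBeforeRetry in the log;
// the 2m2s cap is an upstream-default assumption, not taken from this log.
const (
	initialDurationBeforeRetry = 500 * time.Millisecond
	maxDurationBeforeRetry     = 2*time.Minute + 2*time.Second
)

type backoff struct {
	lastErrorTime       time.Time
	durationBeforeRetry time.Duration
}

// update doubles the wait after every consecutive failure, matching the
// 500ms -> 1s -> 2s -> 4s -> 8s progression in the journal.
func (b *backoff) update() {
	if b.durationBeforeRetry == 0 {
		b.durationBeforeRetry = initialDurationBeforeRetry
	} else if b.durationBeforeRetry < maxDurationBeforeRetry {
		b.durationBeforeRetry *= 2
		if b.durationBeforeRetry > maxDurationBeforeRetry {
			b.durationBeforeRetry = maxDurationBeforeRetry
		}
	}
	b.lastErrorTime = time.Now()
}

// safeToRetry reports whether the backoff window has elapsed, i.e. whether
// the "No retries permitted until ..." deadline has passed.
func (b *backoff) safeToRetry() bool {
	return time.Since(b.lastErrorTime) >= b.durationBeforeRetry
}

func main() {
	var b backoff
	for i := 0; i < 5; i++ {
		b.update()
		fmt.Printf("No retries permitted for %v\n", b.durationBeforeRetry)
	}
}

Run as-is, this prints the same 500ms, 1s, 2s, 4s, 8s progression the journal records for metrics-certs and kube-api-access-6ktgq; the errors stop on their own once the referenced objects are registered and a retry succeeds.
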
Apr 16 16:02:55.386897 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:55.386876 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cqtjh" event={"ID":"ef56dfc1-1254-428e-ba95-899e4b0e0908","Type":"ContainerStarted","Data":"203313b60b3bf1ad6a67c8fc257afa527e75c1165744d3f5895f018d82aefcd7"} Apr 16 16:02:55.387744 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:55.387723 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vq8l4" event={"ID":"3143666f-7f83-4d6e-ae14-75fdaf4f8e7c","Type":"ContainerStarted","Data":"14454e972d0893e459635090d3e732ec7ee762aaa46081f4359d96de19b5f646"} Apr 16 16:02:55.388749 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:55.388732 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sjpbs" event={"ID":"70dfab46-81b5-47d9-b69d-a3d94a7c5e13","Type":"ContainerStarted","Data":"5d786caedbe1e7a8a4b6c06c551e064e811577d790ed1bd1894aec1106c19840"} Apr 16 16:02:55.389650 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:55.389632 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" event={"ID":"5012ef67-2e62-4b06-a2f2-ef785998d3cb","Type":"ContainerStarted","Data":"82c552265a7112ed13e167d82aff3854d47586bde3f884e05068d1783897b4c6"} Apr 16 16:02:55.399845 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:55.399798 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal" podStartSLOduration=2.399785908 podStartE2EDuration="2.399785908s" podCreationTimestamp="2026-04-16 16:02:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:02:55.399503494 +0000 UTC m=+3.576564552" watchObservedRunningTime="2026-04-16 16:02:55.399785908 +0000 UTC m=+3.576846970" Apr 16 16:02:55.954349 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:55.953545 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs\") pod \"network-metrics-daemon-rgfkx\" (UID: \"0aa611e2-18d8-4712-9938-e8c21daeb1a0\") " pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:02:55.954349 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:55.953608 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ktgq\" (UniqueName: \"kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq\") pod \"network-check-target-m7vqg\" (UID: \"313de001-22f6-48de-8e2b-ba59ee1494ec\") " pod="openshift-network-diagnostics/network-check-target-m7vqg" Apr 16 16:02:55.954349 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:55.953748 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:02:55.954349 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:55.953765 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:02:55.954349 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:55.953777 2581 projected.go:194] Error preparing data for projected volume kube-api-access-6ktgq for pod 
openshift-network-diagnostics/network-check-target-m7vqg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:02:55.954349 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:55.953836 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq podName:313de001-22f6-48de-8e2b-ba59ee1494ec nodeName:}" failed. No retries permitted until 2026-04-16 16:02:57.953817624 +0000 UTC m=+6.130878672 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-6ktgq" (UniqueName: "kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq") pod "network-check-target-m7vqg" (UID: "313de001-22f6-48de-8e2b-ba59ee1494ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:02:55.954349 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:55.954241 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:02:55.954349 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:55.954293 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs podName:0aa611e2-18d8-4712-9938-e8c21daeb1a0 nodeName:}" failed. No retries permitted until 2026-04-16 16:02:57.95427562 +0000 UTC m=+6.131336669 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs") pod "network-metrics-daemon-rgfkx" (UID: "0aa611e2-18d8-4712-9938-e8c21daeb1a0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:02:56.372386 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:56.371636 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m7vqg" Apr 16 16:02:56.372386 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:56.371769 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m7vqg" podUID="313de001-22f6-48de-8e2b-ba59ee1494ec" Apr 16 16:02:56.372386 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:56.372220 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:02:56.372386 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:56.372321 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rgfkx" podUID="0aa611e2-18d8-4712-9938-e8c21daeb1a0" Apr 16 16:02:56.413790 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:56.413750 2581 generic.go:358] "Generic (PLEG): container finished" podID="6668c770a04eab2dde37a1f32511d490" containerID="48cda4495a5b1af59855b666af6ab1a1436aee38c11d0ceab5874600dcb2748d" exitCode=0 Apr 16 16:02:56.414416 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:56.414386 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" event={"ID":"6668c770a04eab2dde37a1f32511d490","Type":"ContainerDied","Data":"48cda4495a5b1af59855b666af6ab1a1436aee38c11d0ceab5874600dcb2748d"} Apr 16 16:02:57.421939 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:57.421881 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" event={"ID":"6668c770a04eab2dde37a1f32511d490","Type":"ContainerStarted","Data":"7633f00bbac291e18295a000bb858e5e0e1ab1ad53ed56f63086e8ac51b3eb6b"} Apr 16 16:02:57.445242 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:57.445190 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" podStartSLOduration=4.4451701759999995 podStartE2EDuration="4.445170176s" podCreationTimestamp="2026-04-16 16:02:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:02:57.443367321 +0000 UTC m=+5.620428378" watchObservedRunningTime="2026-04-16 16:02:57.445170176 +0000 UTC m=+5.622231232" Apr 16 16:02:57.970243 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:57.970203 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs\") pod \"network-metrics-daemon-rgfkx\" (UID: \"0aa611e2-18d8-4712-9938-e8c21daeb1a0\") " pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:02:57.970459 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:57.970269 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ktgq\" (UniqueName: \"kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq\") pod \"network-check-target-m7vqg\" (UID: \"313de001-22f6-48de-8e2b-ba59ee1494ec\") " pod="openshift-network-diagnostics/network-check-target-m7vqg" Apr 16 16:02:57.970459 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:57.970447 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:02:57.970575 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:57.970467 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:02:57.970575 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:57.970481 2581 projected.go:194] Error preparing data for projected volume kube-api-access-6ktgq for pod openshift-network-diagnostics/network-check-target-m7vqg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:02:57.970575 ip-10-0-135-144 
kubenswrapper[2581]: E0416 16:02:57.970481 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:02:57.970575 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:57.970545 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq podName:313de001-22f6-48de-8e2b-ba59ee1494ec nodeName:}" failed. No retries permitted until 2026-04-16 16:03:01.970525435 +0000 UTC m=+10.147586481 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-6ktgq" (UniqueName: "kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq") pod "network-check-target-m7vqg" (UID: "313de001-22f6-48de-8e2b-ba59ee1494ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:02:57.970575 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:57.970568 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs podName:0aa611e2-18d8-4712-9938-e8c21daeb1a0 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:01.97055781 +0000 UTC m=+10.147618850 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs") pod "network-metrics-daemon-rgfkx" (UID: "0aa611e2-18d8-4712-9938-e8c21daeb1a0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:02:58.372185 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:58.372097 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:02:58.372327 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:58.372246 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgfkx" podUID="0aa611e2-18d8-4712-9938-e8c21daeb1a0" Apr 16 16:02:58.372327 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:02:58.372297 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m7vqg" Apr 16 16:02:58.372441 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:02:58.372396 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m7vqg" podUID="313de001-22f6-48de-8e2b-ba59ee1494ec" Apr 16 16:03:00.372544 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:00.372255 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m7vqg" Apr 16 16:03:00.372544 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:00.372395 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m7vqg" podUID="313de001-22f6-48de-8e2b-ba59ee1494ec" Apr 16 16:03:00.372544 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:00.372434 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:03:00.373142 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:00.372550 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgfkx" podUID="0aa611e2-18d8-4712-9938-e8c21daeb1a0" Apr 16 16:03:01.461212 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:01.461171 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-r8wpm"] Apr 16 16:03:01.463942 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:01.463432 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r8wpm" Apr 16 16:03:01.463942 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:01.463518 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-r8wpm" podUID="487d3578-03cc-4a7d-9a13-eb666eaf2cf2" Apr 16 16:03:01.499975 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:01.499826 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret\") pod \"global-pull-secret-syncer-r8wpm\" (UID: \"487d3578-03cc-4a7d-9a13-eb666eaf2cf2\") " pod="kube-system/global-pull-secret-syncer-r8wpm" Apr 16 16:03:01.499975 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:01.499885 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-kubelet-config\") pod \"global-pull-secret-syncer-r8wpm\" (UID: \"487d3578-03cc-4a7d-9a13-eb666eaf2cf2\") " pod="kube-system/global-pull-secret-syncer-r8wpm" Apr 16 16:03:01.499975 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:01.499914 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-dbus\") pod \"global-pull-secret-syncer-r8wpm\" (UID: \"487d3578-03cc-4a7d-9a13-eb666eaf2cf2\") " pod="kube-system/global-pull-secret-syncer-r8wpm" Apr 16 16:03:01.601928 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:01.601083 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-dbus\") pod \"global-pull-secret-syncer-r8wpm\" (UID: \"487d3578-03cc-4a7d-9a13-eb666eaf2cf2\") " pod="kube-system/global-pull-secret-syncer-r8wpm" Apr 16 16:03:01.601928 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:01.601205 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret\") pod \"global-pull-secret-syncer-r8wpm\" (UID: \"487d3578-03cc-4a7d-9a13-eb666eaf2cf2\") " pod="kube-system/global-pull-secret-syncer-r8wpm" Apr 16 16:03:01.601928 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:01.601243 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-kubelet-config\") pod \"global-pull-secret-syncer-r8wpm\" (UID: \"487d3578-03cc-4a7d-9a13-eb666eaf2cf2\") " pod="kube-system/global-pull-secret-syncer-r8wpm" Apr 16 16:03:01.601928 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:01.601329 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-kubelet-config\") pod \"global-pull-secret-syncer-r8wpm\" (UID: \"487d3578-03cc-4a7d-9a13-eb666eaf2cf2\") " pod="kube-system/global-pull-secret-syncer-r8wpm" Apr 16 16:03:01.601928 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:01.601509 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-dbus\") pod \"global-pull-secret-syncer-r8wpm\" (UID: \"487d3578-03cc-4a7d-9a13-eb666eaf2cf2\") " pod="kube-system/global-pull-secret-syncer-r8wpm" Apr 16 16:03:01.601928 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:01.601617 2581 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:03:01.601928 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:01.601683 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret podName:487d3578-03cc-4a7d-9a13-eb666eaf2cf2 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:02.101661257 +0000 UTC m=+10.278722298 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret") pod "global-pull-secret-syncer-r8wpm" (UID: "487d3578-03cc-4a7d-9a13-eb666eaf2cf2") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:03:02.003953 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:02.003922 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ktgq\" (UniqueName: \"kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq\") pod \"network-check-target-m7vqg\" (UID: \"313de001-22f6-48de-8e2b-ba59ee1494ec\") " pod="openshift-network-diagnostics/network-check-target-m7vqg" Apr 16 16:03:02.004138 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:02.004027 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs\") pod \"network-metrics-daemon-rgfkx\" (UID: \"0aa611e2-18d8-4712-9938-e8c21daeb1a0\") " pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:03:02.004138 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:02.004094 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:02.004138 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:02.004115 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:02.004138 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:02.004128 2581 projected.go:194] Error preparing data for projected volume kube-api-access-6ktgq for pod openshift-network-diagnostics/network-check-target-m7vqg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:02.004360 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:02.004099 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:02.004360 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:02.004209 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq podName:313de001-22f6-48de-8e2b-ba59ee1494ec nodeName:}" failed. No retries permitted until 2026-04-16 16:03:10.004162269 +0000 UTC m=+18.181223325 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6ktgq" (UniqueName: "kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq") pod "network-check-target-m7vqg" (UID: "313de001-22f6-48de-8e2b-ba59ee1494ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:03:02.004360 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:02.004229 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs podName:0aa611e2-18d8-4712-9938-e8c21daeb1a0 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:10.004218874 +0000 UTC m=+18.181279913 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs") pod "network-metrics-daemon-rgfkx" (UID: "0aa611e2-18d8-4712-9938-e8c21daeb1a0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:03:02.104652 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:02.104614 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret\") pod \"global-pull-secret-syncer-r8wpm\" (UID: \"487d3578-03cc-4a7d-9a13-eb666eaf2cf2\") " pod="kube-system/global-pull-secret-syncer-r8wpm"
Apr 16 16:03:02.104850 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:02.104721 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:03:02.104850 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:02.104764 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret podName:487d3578-03cc-4a7d-9a13-eb666eaf2cf2 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:03.104751391 +0000 UTC m=+11.281812424 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret") pod "global-pull-secret-syncer-r8wpm" (UID: "487d3578-03cc-4a7d-9a13-eb666eaf2cf2") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:03:02.373297 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:02.373208 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m7vqg"
Apr 16 16:03:02.373481 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:02.373322 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m7vqg" podUID="313de001-22f6-48de-8e2b-ba59ee1494ec"
Apr 16 16:03:02.374103 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:02.374059 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgfkx"
Apr 16 16:03:02.374225 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:02.374168 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgfkx" podUID="0aa611e2-18d8-4712-9938-e8c21daeb1a0"
Apr 16 16:03:03.112572 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:03.112487 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret\") pod \"global-pull-secret-syncer-r8wpm\" (UID: \"487d3578-03cc-4a7d-9a13-eb666eaf2cf2\") " pod="kube-system/global-pull-secret-syncer-r8wpm"
Apr 16 16:03:03.113040 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:03.112634 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:03:03.113040 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:03.112706 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret podName:487d3578-03cc-4a7d-9a13-eb666eaf2cf2 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:05.112690312 +0000 UTC m=+13.289751346 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret") pod "global-pull-secret-syncer-r8wpm" (UID: "487d3578-03cc-4a7d-9a13-eb666eaf2cf2") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:03:03.371386 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:03.371291 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r8wpm"
Apr 16 16:03:03.371552 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:03.371447 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r8wpm" podUID="487d3578-03cc-4a7d-9a13-eb666eaf2cf2"
Apr 16 16:03:04.372479 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:04.372441 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m7vqg"
Apr 16 16:03:04.372878 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:04.372453 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgfkx"
Apr 16 16:03:04.372878 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:04.372570 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m7vqg" podUID="313de001-22f6-48de-8e2b-ba59ee1494ec"
Apr 16 16:03:04.372878 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:04.372666 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgfkx" podUID="0aa611e2-18d8-4712-9938-e8c21daeb1a0"
Apr 16 16:03:05.126815 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:05.126783 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret\") pod \"global-pull-secret-syncer-r8wpm\" (UID: \"487d3578-03cc-4a7d-9a13-eb666eaf2cf2\") " pod="kube-system/global-pull-secret-syncer-r8wpm"
Apr 16 16:03:05.126978 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:05.126933 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:03:05.127049 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:05.126997 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret podName:487d3578-03cc-4a7d-9a13-eb666eaf2cf2 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:09.12697808 +0000 UTC m=+17.304039135 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret") pod "global-pull-secret-syncer-r8wpm" (UID: "487d3578-03cc-4a7d-9a13-eb666eaf2cf2") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:03:05.371543 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:05.371506 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r8wpm"
Apr 16 16:03:05.371705 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:05.371636 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r8wpm" podUID="487d3578-03cc-4a7d-9a13-eb666eaf2cf2"
Apr 16 16:03:06.371968 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:06.371927 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m7vqg"
Apr 16 16:03:06.372435 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:06.372052 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m7vqg" podUID="313de001-22f6-48de-8e2b-ba59ee1494ec"
Apr 16 16:03:06.372435 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:06.372106 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgfkx"
Apr 16 16:03:06.372435 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:06.372210 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgfkx" podUID="0aa611e2-18d8-4712-9938-e8c21daeb1a0"
Apr 16 16:03:07.371248 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:07.371213 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r8wpm"
Apr 16 16:03:07.371435 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:07.371368 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r8wpm" podUID="487d3578-03cc-4a7d-9a13-eb666eaf2cf2"
Apr 16 16:03:08.371691 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:08.371274 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m7vqg"
Apr 16 16:03:08.371691 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:08.371437 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m7vqg" podUID="313de001-22f6-48de-8e2b-ba59ee1494ec"
Apr 16 16:03:08.371691 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:08.371502 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgfkx"
Apr 16 16:03:08.371691 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:08.371629 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgfkx" podUID="0aa611e2-18d8-4712-9938-e8c21daeb1a0"
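Note the durationBeforeRetry values in the nestedpendingoperations entries above: 1s, 2s, 4s, then 8s, with 16s and 32s appearing further down. The mount retry delay doubles on each failure up to a cap. A minimal sketch of that doubling behavior, assuming nothing about kubelet internals beyond what the log shows (the names backoff and Next are illustrative, not kubelet's actual API):

    package main

    import (
        "fmt"
        "time"
    )

    // backoff models the per-operation retry delay visible in the log.
    type backoff struct {
        delay, max time.Duration
    }

    // Next returns the current delay, then doubles it up to the cap.
    func (b *backoff) Next() time.Duration {
        d := b.delay
        b.delay *= 2
        if b.delay > b.max {
            b.delay = b.max
        }
        return d
    }

    func main() {
        b := &backoff{delay: time.Second, max: 32 * time.Second}
        for i := 0; i < 6; i++ {
            fmt.Println(b.Next()) // 1s 2s 4s 8s 16s 32s
        }
    }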
pod="openshift-multus/network-metrics-daemon-rgfkx" podUID="0aa611e2-18d8-4712-9938-e8c21daeb1a0" Apr 16 16:03:09.155353 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:09.155304 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret\") pod \"global-pull-secret-syncer-r8wpm\" (UID: \"487d3578-03cc-4a7d-9a13-eb666eaf2cf2\") " pod="kube-system/global-pull-secret-syncer-r8wpm" Apr 16 16:03:09.155542 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:09.155475 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:03:09.155601 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:09.155551 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret podName:487d3578-03cc-4a7d-9a13-eb666eaf2cf2 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:17.155530499 +0000 UTC m=+25.332591547 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret") pod "global-pull-secret-syncer-r8wpm" (UID: "487d3578-03cc-4a7d-9a13-eb666eaf2cf2") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:03:09.371966 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:09.371934 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r8wpm" Apr 16 16:03:09.372473 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:09.372045 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-r8wpm" podUID="487d3578-03cc-4a7d-9a13-eb666eaf2cf2" Apr 16 16:03:10.061550 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:10.061516 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs\") pod \"network-metrics-daemon-rgfkx\" (UID: \"0aa611e2-18d8-4712-9938-e8c21daeb1a0\") " pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:03:10.061807 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:10.061560 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ktgq\" (UniqueName: \"kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq\") pod \"network-check-target-m7vqg\" (UID: \"313de001-22f6-48de-8e2b-ba59ee1494ec\") " pod="openshift-network-diagnostics/network-check-target-m7vqg" Apr 16 16:03:10.061807 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:10.061672 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:10.061807 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:10.061682 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:10.061807 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:10.061698 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:10.061807 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:10.061710 2581 projected.go:194] Error preparing data for projected volume kube-api-access-6ktgq for pod openshift-network-diagnostics/network-check-target-m7vqg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:10.061807 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:10.061739 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs podName:0aa611e2-18d8-4712-9938-e8c21daeb1a0 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:26.061720688 +0000 UTC m=+34.238781733 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs") pod "network-metrics-daemon-rgfkx" (UID: "0aa611e2-18d8-4712-9938-e8c21daeb1a0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:10.061807 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:10.061760 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq podName:313de001-22f6-48de-8e2b-ba59ee1494ec nodeName:}" failed. No retries permitted until 2026-04-16 16:03:26.061749861 +0000 UTC m=+34.238810899 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6ktgq" (UniqueName: "kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq") pod "network-check-target-m7vqg" (UID: "313de001-22f6-48de-8e2b-ba59ee1494ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:10.371530 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:10.371446 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:03:10.371735 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:10.371446 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m7vqg" Apr 16 16:03:10.371735 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:10.371602 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgfkx" podUID="0aa611e2-18d8-4712-9938-e8c21daeb1a0" Apr 16 16:03:10.371735 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:10.371678 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m7vqg" podUID="313de001-22f6-48de-8e2b-ba59ee1494ec" Apr 16 16:03:11.371421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:11.371368 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r8wpm" Apr 16 16:03:11.371878 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:11.371553 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r8wpm" podUID="487d3578-03cc-4a7d-9a13-eb666eaf2cf2" Apr 16 16:03:12.372097 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:12.372075 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:03:12.372445 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:12.372167 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgfkx" podUID="0aa611e2-18d8-4712-9938-e8c21daeb1a0" Apr 16 16:03:12.372445 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:12.372209 2581 util.go:30] "No sandbox for pod can be found. 
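Every pod that needs the cluster network is skipped with the same error until a CNI configuration appears in /etc/kubernetes/cni/net.d/; the runtime reports NetworkReady=false until then. A sketch of the shape of such a gate, under the assumption that the runtime simply checks for a CNI conflist on disk (networkReady and syncPod are hypothetical names, not the kubelet's actual functions):

    package main

    import (
        "errors"
        "fmt"
        "path/filepath"
    )

    // networkReady fails until the network plugin has written its config.
    func networkReady(cniConfDir string) error {
        matches, _ := filepath.Glob(filepath.Join(cniConfDir, "*.conflist"))
        if len(matches) == 0 {
            return errors.New("network is not ready: no CNI configuration file in " + cniConfDir)
        }
        return nil
    }

    // syncPod skips non-host-network pods while the gate is closed;
    // host-network pods (e.g. ovnkube-node itself) are exempt, which is
    // how the network plugin can start at all.
    func syncPod(hostNetwork bool) error {
        if hostNetwork {
            return nil
        }
        return networkReady("/etc/kubernetes/cni/net.d")
    }

    func main() {
        if err := syncPod(false); err != nil {
            fmt.Println("Error syncing pod, skipping:", err)
        }
    }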
Apr 16 16:03:12.372445 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:12.372258 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m7vqg" podUID="313de001-22f6-48de-8e2b-ba59ee1494ec"
Apr 16 16:03:13.371931 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.371645 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r8wpm"
Apr 16 16:03:13.372089 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:13.372006 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r8wpm" podUID="487d3578-03cc-4a7d-9a13-eb666eaf2cf2"
Apr 16 16:03:13.452925 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.452884 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sjpbs" event={"ID":"70dfab46-81b5-47d9-b69d-a3d94a7c5e13","Type":"ContainerStarted","Data":"84eb55586af3192d966adba4e8a380dee54546afdca96079821b1d493a1091ad"}
Apr 16 16:03:13.454991 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.454966 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" event={"ID":"5012ef67-2e62-4b06-a2f2-ef785998d3cb","Type":"ContainerStarted","Data":"3d4ed33668d4abccfc9389b234fdae2dd7a92d9f727276eefe934a875cc7497f"}
Apr 16 16:03:13.456440 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.456415 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-t5mlw" event={"ID":"d19613ed-0faf-481f-bc0d-b4f8fcf0f259","Type":"ContainerStarted","Data":"6ae70906c3f8de626a18425ac647561e4ebd1fd679e6b082c19892d84ef82536"}
Apr 16 16:03:13.457961 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.457933 2581 generic.go:358] "Generic (PLEG): container finished" podID="fe31cde4-f24b-44d8-9e19-ba426c58b544" containerID="f58cdf26abeba1ff76d5edd5e5f32440e50a313b22826ea1871b73f7c7d5d6c1" exitCode=0
Apr 16 16:03:13.458076 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.458017 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ml8fq" event={"ID":"fe31cde4-f24b-44d8-9e19-ba426c58b544","Type":"ContainerDied","Data":"f58cdf26abeba1ff76d5edd5e5f32440e50a313b22826ea1871b73f7c7d5d6c1"}
Apr 16 16:03:13.459596 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.459514 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fph7l" event={"ID":"40d4bde4-af0f-486a-90e6-101c53fe3e24","Type":"ContainerStarted","Data":"7ba936b60b92b8ea93192947a08ce8631320f88dc52607f820a7e22aee904e9d"}
Apr 16 16:03:13.462661 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.462638 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" event={"ID":"2c718a57-bd23-432b-bf19-493fd2ad600a","Type":"ContainerStarted","Data":"5829287778b72a438ee9be003eb9acbc519dbfc82dbc7a4c0d39706b8e6e5a98"}
Apr 16 16:03:13.462764 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.462668 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" event={"ID":"2c718a57-bd23-432b-bf19-493fd2ad600a","Type":"ContainerStarted","Data":"df882c077fbfe8740cc647874330da442cddcdc25b90d133605d04865814354c"}
Apr 16 16:03:13.462764 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.462682 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" event={"ID":"2c718a57-bd23-432b-bf19-493fd2ad600a","Type":"ContainerStarted","Data":"c464f4291755bfefc019299aa1696307247ac1ae8d2ab087c74d3ec4c6c28d40"}
Apr 16 16:03:13.462764 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.462694 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" event={"ID":"2c718a57-bd23-432b-bf19-493fd2ad600a","Type":"ContainerStarted","Data":"abeb0428506efc529279e6a5443582e46a8700b03e935f03e262cdc7ff7c4dea"}
Apr 16 16:03:13.462764 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.462710 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" event={"ID":"2c718a57-bd23-432b-bf19-493fd2ad600a","Type":"ContainerStarted","Data":"3813aae73c6e660cbeda87c15b638d9d71f0ad8d7ef4157a336cb1f84c08be70"}
Apr 16 16:03:13.462764 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.462723 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" event={"ID":"2c718a57-bd23-432b-bf19-493fd2ad600a","Type":"ContainerStarted","Data":"76be459957e931780d3021111203138a838925893ae9e391bb4190f4766c78f9"}
Apr 16 16:03:13.464135 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.464111 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-662l5" event={"ID":"2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee","Type":"ContainerStarted","Data":"1a40fde89df4c574a926ad52e81afe43686c15c1847c5adf9623c91b085b7266"}
Apr 16 16:03:13.465491 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.465470 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cqtjh" event={"ID":"ef56dfc1-1254-428e-ba95-899e4b0e0908","Type":"ContainerStarted","Data":"650391dd9bfd0f919dc94e28e84571e064bc6b2a2df2ac578bbef5484dcb890a"}
Apr 16 16:03:13.470004 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.469966 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-sjpbs" podStartSLOduration=4.5071789639999995 podStartE2EDuration="21.469956135s" podCreationTimestamp="2026-04-16 16:02:52 +0000 UTC" firstStartedPulling="2026-04-16 16:02:55.111769866 +0000 UTC m=+3.288830913" lastFinishedPulling="2026-04-16 16:03:12.074547044 +0000 UTC m=+20.251608084" observedRunningTime="2026-04-16 16:03:13.469577164 +0000 UTC m=+21.646638220" watchObservedRunningTime="2026-04-16 16:03:13.469956135 +0000 UTC m=+21.647017190"
Apr 16 16:03:13.490085 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.490049 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-fph7l" podStartSLOduration=4.222585558 podStartE2EDuration="21.490037181s" podCreationTimestamp="2026-04-16 16:02:52 +0000 UTC" firstStartedPulling="2026-04-16 16:02:55.111619556 +0000 UTC m=+3.288680594" lastFinishedPulling="2026-04-16 16:03:12.379071169 +0000 UTC m=+20.556132217" observedRunningTime="2026-04-16 16:03:13.489757644 +0000 UTC m=+21.666818698" watchObservedRunningTime="2026-04-16 16:03:13.490037181 +0000 UTC m=+21.667098236"
lastFinishedPulling="2026-04-16 16:03:12.379071169 +0000 UTC m=+20.556132217" observedRunningTime="2026-04-16 16:03:13.489757644 +0000 UTC m=+21.666818698" watchObservedRunningTime="2026-04-16 16:03:13.490037181 +0000 UTC m=+21.667098236" Apr 16 16:03:13.504255 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.504212 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-t5mlw" podStartSLOduration=12.462473647 podStartE2EDuration="21.504201109s" podCreationTimestamp="2026-04-16 16:02:52 +0000 UTC" firstStartedPulling="2026-04-16 16:02:55.117287168 +0000 UTC m=+3.294348208" lastFinishedPulling="2026-04-16 16:03:04.159014633 +0000 UTC m=+12.336075670" observedRunningTime="2026-04-16 16:03:13.504132274 +0000 UTC m=+21.681193343" watchObservedRunningTime="2026-04-16 16:03:13.504201109 +0000 UTC m=+21.681262164" Apr 16 16:03:13.532135 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.532086 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-662l5" podStartSLOduration=4.259431156 podStartE2EDuration="21.532068797s" podCreationTimestamp="2026-04-16 16:02:52 +0000 UTC" firstStartedPulling="2026-04-16 16:02:55.104999119 +0000 UTC m=+3.282060165" lastFinishedPulling="2026-04-16 16:03:12.377636767 +0000 UTC m=+20.554697806" observedRunningTime="2026-04-16 16:03:13.531485194 +0000 UTC m=+21.708546256" watchObservedRunningTime="2026-04-16 16:03:13.532068797 +0000 UTC m=+21.709129852" Apr 16 16:03:13.624966 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.624926 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-cqtjh" podStartSLOduration=4.382701723 podStartE2EDuration="21.624912109s" podCreationTimestamp="2026-04-16 16:02:52 +0000 UTC" firstStartedPulling="2026-04-16 16:02:55.116535716 +0000 UTC m=+3.293596751" lastFinishedPulling="2026-04-16 16:03:12.358746098 +0000 UTC m=+20.535807137" observedRunningTime="2026-04-16 16:03:13.624539334 +0000 UTC m=+21.801600389" watchObservedRunningTime="2026-04-16 16:03:13.624912109 +0000 UTC m=+21.801973165" Apr 16 16:03:13.639065 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:13.639043 2581 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 16:03:14.292302 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:14.292201 2581 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T16:03:13.63906077Z","UUID":"cdfa7f7f-4bbf-478e-ae78-d10dc654f39e","Handler":null,"Name":"","Endpoint":""} Apr 16 16:03:14.294050 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:14.294028 2581 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 16:03:14.294050 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:14.294058 2581 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 16:03:14.371628 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:14.371595 2581 util.go:30] "No sandbox for pod can be found. 
Apr 16 16:03:14.371628 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:14.371614 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m7vqg"
Apr 16 16:03:14.371874 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:14.371721 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgfkx" podUID="0aa611e2-18d8-4712-9938-e8c21daeb1a0"
Apr 16 16:03:14.371874 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:14.371799 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m7vqg" podUID="313de001-22f6-48de-8e2b-ba59ee1494ec"
Apr 16 16:03:14.469441 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:14.469405 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" event={"ID":"5012ef67-2e62-4b06-a2f2-ef785998d3cb","Type":"ContainerStarted","Data":"385b2a539eae8f73ab88df1909045def233f28a5a411c089adac0f34739cbfcc"}
Apr 16 16:03:14.471199 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:14.471168 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vq8l4" event={"ID":"3143666f-7f83-4d6e-ae14-75fdaf4f8e7c","Type":"ContainerStarted","Data":"3da808ebecbd01d1c1a01af77d73465ddd5b5dc767a87d579988285057694204"}
Apr 16 16:03:15.372218 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:15.372034 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r8wpm"
Apr 16 16:03:15.372401 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:15.372306 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r8wpm" podUID="487d3578-03cc-4a7d-9a13-eb666eaf2cf2"
Apr 16 16:03:15.474872 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:15.474788 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" event={"ID":"5012ef67-2e62-4b06-a2f2-ef785998d3cb","Type":"ContainerStarted","Data":"07c2da10b7f491db7110087955151549c5314f340f712d36f8634781175370c0"}
Apr 16 16:03:15.478660 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:15.478612 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" event={"ID":"2c718a57-bd23-432b-bf19-493fd2ad600a","Type":"ContainerStarted","Data":"661ea13c01a358d00bd90b9dd8747ae39a334ee82250de172de73b437ef9a680"}
Apr 16 16:03:15.496484 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:15.496436 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c6kvm" podStartSLOduration=3.759971896 podStartE2EDuration="23.496417717s" podCreationTimestamp="2026-04-16 16:02:52 +0000 UTC" firstStartedPulling="2026-04-16 16:02:55.103972428 +0000 UTC m=+3.281033475" lastFinishedPulling="2026-04-16 16:03:14.840418258 +0000 UTC m=+23.017479296" observedRunningTime="2026-04-16 16:03:15.49626416 +0000 UTC m=+23.673325228" watchObservedRunningTime="2026-04-16 16:03:15.496417717 +0000 UTC m=+23.673478772"
Apr 16 16:03:15.496901 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:15.496875 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-vq8l4" podStartSLOduration=6.251937188 podStartE2EDuration="23.496867233s" podCreationTimestamp="2026-04-16 16:02:52 +0000 UTC" firstStartedPulling="2026-04-16 16:02:55.113817117 +0000 UTC m=+3.290878152" lastFinishedPulling="2026-04-16 16:03:12.358747149 +0000 UTC m=+20.535808197" observedRunningTime="2026-04-16 16:03:14.486943869 +0000 UTC m=+22.664004924" watchObservedRunningTime="2026-04-16 16:03:15.496867233 +0000 UTC m=+23.673928289"
Apr 16 16:03:16.372181 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:16.372147 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m7vqg"
Apr 16 16:03:16.372368 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:16.372272 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m7vqg" podUID="313de001-22f6-48de-8e2b-ba59ee1494ec"
Apr 16 16:03:16.372368 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:16.372326 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgfkx"
Apr 16 16:03:16.373833 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:16.372752 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgfkx" podUID="0aa611e2-18d8-4712-9938-e8c21daeb1a0"
Apr 16 16:03:17.216566 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:17.216520 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret\") pod \"global-pull-secret-syncer-r8wpm\" (UID: \"487d3578-03cc-4a7d-9a13-eb666eaf2cf2\") " pod="kube-system/global-pull-secret-syncer-r8wpm"
Apr 16 16:03:17.217252 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:17.216691 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:03:17.217252 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:17.216759 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret podName:487d3578-03cc-4a7d-9a13-eb666eaf2cf2 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:33.216744 +0000 UTC m=+41.393805033 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret") pod "global-pull-secret-syncer-r8wpm" (UID: "487d3578-03cc-4a7d-9a13-eb666eaf2cf2") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:03:17.371583 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:17.371556 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r8wpm"
Apr 16 16:03:17.371730 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:17.371650 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r8wpm" podUID="487d3578-03cc-4a7d-9a13-eb666eaf2cf2"
pod="kube-system/global-pull-secret-syncer-r8wpm" podUID="487d3578-03cc-4a7d-9a13-eb666eaf2cf2" Apr 16 16:03:17.487250 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:17.486985 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" event={"ID":"2c718a57-bd23-432b-bf19-493fd2ad600a","Type":"ContainerStarted","Data":"a41fad685507d593739b12a4e7e48f101ba72aaf329f9506c110e45bdb1d1e3d"} Apr 16 16:03:17.487555 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:17.487367 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:03:17.487555 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:17.487500 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:03:17.487555 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:17.487517 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:03:17.509943 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:17.509795 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:03:17.509943 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:17.509900 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" Apr 16 16:03:17.523471 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:17.523298 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm" podStartSLOduration=7.840410272 podStartE2EDuration="25.523279355s" podCreationTimestamp="2026-04-16 16:02:52 +0000 UTC" firstStartedPulling="2026-04-16 16:02:55.108247244 +0000 UTC m=+3.285308278" lastFinishedPulling="2026-04-16 16:03:12.791116327 +0000 UTC m=+20.968177361" observedRunningTime="2026-04-16 16:03:17.521588183 +0000 UTC m=+25.698649237" watchObservedRunningTime="2026-04-16 16:03:17.523279355 +0000 UTC m=+25.700340412" Apr 16 16:03:18.368092 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:18.368064 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-cqtjh" Apr 16 16:03:18.368690 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:18.368672 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-cqtjh" Apr 16 16:03:18.371754 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:18.371738 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m7vqg" Apr 16 16:03:18.371847 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:18.371831 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m7vqg" podUID="313de001-22f6-48de-8e2b-ba59ee1494ec" Apr 16 16:03:18.371921 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:18.371905 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:03:18.372081 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:18.372062 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgfkx" podUID="0aa611e2-18d8-4712-9938-e8c21daeb1a0" Apr 16 16:03:18.489993 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:18.489958 2581 generic.go:358] "Generic (PLEG): container finished" podID="fe31cde4-f24b-44d8-9e19-ba426c58b544" containerID="2a67aa5ac70c37ebef3324a35a19743d8549c6eb70bb1cb95c7a942d548a1322" exitCode=0 Apr 16 16:03:18.490137 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:18.490046 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ml8fq" event={"ID":"fe31cde4-f24b-44d8-9e19-ba426c58b544","Type":"ContainerDied","Data":"2a67aa5ac70c37ebef3324a35a19743d8549c6eb70bb1cb95c7a942d548a1322"} Apr 16 16:03:18.491267 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:18.490665 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-cqtjh" Apr 16 16:03:18.491267 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:18.490825 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-cqtjh" Apr 16 16:03:19.372142 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:19.372110 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r8wpm" Apr 16 16:03:19.372538 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:19.372227 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r8wpm" podUID="487d3578-03cc-4a7d-9a13-eb666eaf2cf2" Apr 16 16:03:19.383274 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:19.383246 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rgfkx"] Apr 16 16:03:19.383422 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:19.383386 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:03:19.383494 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:19.383473 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgfkx" podUID="0aa611e2-18d8-4712-9938-e8c21daeb1a0" Apr 16 16:03:19.386455 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:19.386415 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-m7vqg"] Apr 16 16:03:19.386569 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:19.386529 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m7vqg" Apr 16 16:03:19.386647 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:19.386630 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m7vqg" podUID="313de001-22f6-48de-8e2b-ba59ee1494ec" Apr 16 16:03:19.386962 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:19.386942 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-r8wpm"] Apr 16 16:03:19.494020 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:19.493929 2581 generic.go:358] "Generic (PLEG): container finished" podID="fe31cde4-f24b-44d8-9e19-ba426c58b544" containerID="5a0a9c348774346ad80a465d900d17281d48efb50dc9827104fca783558ec3d8" exitCode=0 Apr 16 16:03:19.494164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:19.494038 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ml8fq" event={"ID":"fe31cde4-f24b-44d8-9e19-ba426c58b544","Type":"ContainerDied","Data":"5a0a9c348774346ad80a465d900d17281d48efb50dc9827104fca783558ec3d8"} Apr 16 16:03:19.494164 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:19.494104 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r8wpm" Apr 16 16:03:19.494259 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:19.494186 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r8wpm" podUID="487d3578-03cc-4a7d-9a13-eb666eaf2cf2" Apr 16 16:03:20.498121 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:20.498089 2581 generic.go:358] "Generic (PLEG): container finished" podID="fe31cde4-f24b-44d8-9e19-ba426c58b544" containerID="cbe67117d72bad72a332ac5b829a04d1e987ec4f81c5590b3c54b86c4757a852" exitCode=0 Apr 16 16:03:20.498819 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:20.498156 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ml8fq" event={"ID":"fe31cde4-f24b-44d8-9e19-ba426c58b544","Type":"ContainerDied","Data":"cbe67117d72bad72a332ac5b829a04d1e987ec4f81c5590b3c54b86c4757a852"} Apr 16 16:03:21.371736 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:21.371699 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m7vqg" Apr 16 16:03:21.371926 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:21.371699 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r8wpm" Apr 16 16:03:21.371926 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:21.371811 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m7vqg" podUID="313de001-22f6-48de-8e2b-ba59ee1494ec" Apr 16 16:03:21.371926 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:21.371830 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:03:21.372082 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:21.371942 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r8wpm" podUID="487d3578-03cc-4a7d-9a13-eb666eaf2cf2" Apr 16 16:03:21.372082 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:21.372031 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgfkx" podUID="0aa611e2-18d8-4712-9938-e8c21daeb1a0" Apr 16 16:03:23.371774 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:23.371739 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:03:23.372588 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:23.371739 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r8wpm" Apr 16 16:03:23.372588 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:23.371739 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m7vqg" Apr 16 16:03:23.372588 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:23.371898 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgfkx" podUID="0aa611e2-18d8-4712-9938-e8c21daeb1a0" Apr 16 16:03:23.372588 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:23.372003 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m7vqg" podUID="313de001-22f6-48de-8e2b-ba59ee1494ec" Apr 16 16:03:23.372588 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:23.372099 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r8wpm" podUID="487d3578-03cc-4a7d-9a13-eb666eaf2cf2" Apr 16 16:03:25.371632 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.371598 2581 util.go:30] "No sandbox for pod can be found. 
Apr 16 16:03:25.372110 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:25.371735 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-r8wpm" podUID="487d3578-03cc-4a7d-9a13-eb666eaf2cf2"
Apr 16 16:03:25.372110 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.371808 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m7vqg"
Apr 16 16:03:25.372110 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:25.371912 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m7vqg" podUID="313de001-22f6-48de-8e2b-ba59ee1494ec"
Apr 16 16:03:25.372110 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.371966 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgfkx"
Apr 16 16:03:25.372110 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:25.372044 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rgfkx" podUID="0aa611e2-18d8-4712-9938-e8c21daeb1a0"
Apr 16 16:03:25.681681 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.681644 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeReady"
Apr 16 16:03:25.681869 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.681783 2581 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 16:03:25.741838 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.741800 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dlmbb"]
Apr 16 16:03:25.771827 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.771784 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-t9bw7"]
Apr 16 16:03:25.772019 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.771964 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dlmbb"
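This is the turning point of the boot: ovnkube-node's readiness probes went ready at 16:03:17, and by 16:03:25 the node is recorded NodeReady, presumably because the network plugin has by now written its CNI configuration. The kubelet fast-updates its node status, and the API server immediately delivers the first post-network pods (ingress-canary-dlmbb, dns-default-t9bw7), whose per-namespace object caches are populated in the entries that follow.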
Apr 16 16:03:25.774239 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.774212 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 16:03:25.774380 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.774221 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 16:03:25.774816 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.774605 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 16:03:25.774816 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.774606 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wjzs4\""
Apr 16 16:03:25.789728 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.789535 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dlmbb"]
Apr 16 16:03:25.789728 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.789741 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t9bw7"]
Apr 16 16:03:25.789921 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.789705 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:03:25.792259 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.792084 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-v9cqx\""
Apr 16 16:03:25.792259 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.792101 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 16:03:25.792259 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.792146 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 16:03:25.882378 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.882325 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f1e4962-e3c5-4ee3-952d-c5105193db44-config-volume\") pod \"dns-default-t9bw7\" (UID: \"2f1e4962-e3c5-4ee3-952d-c5105193db44\") " pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:03:25.882559 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.882406 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dhr4\" (UniqueName: \"kubernetes.io/projected/2f1e4962-e3c5-4ee3-952d-c5105193db44-kube-api-access-7dhr4\") pod \"dns-default-t9bw7\" (UID: \"2f1e4962-e3c5-4ee3-952d-c5105193db44\") " pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:03:25.882559 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.882452 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2f1e4962-e3c5-4ee3-952d-c5105193db44-tmp-dir\") pod \"dns-default-t9bw7\" (UID: \"2f1e4962-e3c5-4ee3-952d-c5105193db44\") " pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:03:25.882559 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.882482 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert\") pod \"ingress-canary-dlmbb\" (UID: \"59a52d95-8872-4fa3-b620-5f948a8a6e16\") " pod="openshift-ingress-canary/ingress-canary-dlmbb"
Apr 16 16:03:25.882559 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.882530 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls\") pod \"dns-default-t9bw7\" (UID: \"2f1e4962-e3c5-4ee3-952d-c5105193db44\") " pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:03:25.882559 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.882549 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc4v8\" (UniqueName: \"kubernetes.io/projected/59a52d95-8872-4fa3-b620-5f948a8a6e16-kube-api-access-wc4v8\") pod \"ingress-canary-dlmbb\" (UID: \"59a52d95-8872-4fa3-b620-5f948a8a6e16\") " pod="openshift-ingress-canary/ingress-canary-dlmbb"
Apr 16 16:03:25.983369 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.983268 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls\") pod \"dns-default-t9bw7\" (UID: \"2f1e4962-e3c5-4ee3-952d-c5105193db44\") " pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:03:25.983369 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.983318 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wc4v8\" (UniqueName: \"kubernetes.io/projected/59a52d95-8872-4fa3-b620-5f948a8a6e16-kube-api-access-wc4v8\") pod \"ingress-canary-dlmbb\" (UID: \"59a52d95-8872-4fa3-b620-5f948a8a6e16\") " pod="openshift-ingress-canary/ingress-canary-dlmbb"
Apr 16 16:03:25.983623 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.983371 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f1e4962-e3c5-4ee3-952d-c5105193db44-config-volume\") pod \"dns-default-t9bw7\" (UID: \"2f1e4962-e3c5-4ee3-952d-c5105193db44\") " pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:03:25.983623 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.983416 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dhr4\" (UniqueName: \"kubernetes.io/projected/2f1e4962-e3c5-4ee3-952d-c5105193db44-kube-api-access-7dhr4\") pod \"dns-default-t9bw7\" (UID: \"2f1e4962-e3c5-4ee3-952d-c5105193db44\") " pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:03:25.983623 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:25.983439 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:03:25.983623 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.983471 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2f1e4962-e3c5-4ee3-952d-c5105193db44-tmp-dir\") pod \"dns-default-t9bw7\" (UID: \"2f1e4962-e3c5-4ee3-952d-c5105193db44\") " pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:03:25.983623 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.983498 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert\") pod \"ingress-canary-dlmbb\" (UID: \"59a52d95-8872-4fa3-b620-5f948a8a6e16\") " pod="openshift-ingress-canary/ingress-canary-dlmbb"
Apr 16 16:03:25.983623 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:25.983518 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls podName:2f1e4962-e3c5-4ee3-952d-c5105193db44 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:26.483494362 +0000 UTC m=+34.660555422 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls") pod "dns-default-t9bw7" (UID: "2f1e4962-e3c5-4ee3-952d-c5105193db44") : secret "dns-default-metrics-tls" not found
Apr 16 16:03:25.983623 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:25.983568 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:03:25.983623 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:25.983614 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert podName:59a52d95-8872-4fa3-b620-5f948a8a6e16 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:26.483601021 +0000 UTC m=+34.660662071 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert") pod "ingress-canary-dlmbb" (UID: "59a52d95-8872-4fa3-b620-5f948a8a6e16") : secret "canary-serving-cert" not found
Apr 16 16:03:25.983969 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.983953 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2f1e4962-e3c5-4ee3-952d-c5105193db44-tmp-dir\") pod \"dns-default-t9bw7\" (UID: \"2f1e4962-e3c5-4ee3-952d-c5105193db44\") " pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:03:25.984120 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.984090 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f1e4962-e3c5-4ee3-952d-c5105193db44-config-volume\") pod \"dns-default-t9bw7\" (UID: \"2f1e4962-e3c5-4ee3-952d-c5105193db44\") " pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:03:25.995649 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.995622 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dhr4\" (UniqueName: \"kubernetes.io/projected/2f1e4962-e3c5-4ee3-952d-c5105193db44-kube-api-access-7dhr4\") pod \"dns-default-t9bw7\" (UID: \"2f1e4962-e3c5-4ee3-952d-c5105193db44\") " pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:03:25.995807 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:25.995740 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc4v8\" (UniqueName: \"kubernetes.io/projected/59a52d95-8872-4fa3-b620-5f948a8a6e16-kube-api-access-wc4v8\") pod \"ingress-canary-dlmbb\" (UID: \"59a52d95-8872-4fa3-b620-5f948a8a6e16\") " pod="openshift-ingress-canary/ingress-canary-dlmbb"
Apr 16 16:03:26.084226 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:26.084190 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs\") pod \"network-metrics-daemon-rgfkx\" (UID: \"0aa611e2-18d8-4712-9938-e8c21daeb1a0\") " pod="openshift-multus/network-metrics-daemon-rgfkx"
Apr 16 16:03:26.084226 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:26.084236 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ktgq\" (UniqueName: \"kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq\") pod \"network-check-target-m7vqg\" (UID: \"313de001-22f6-48de-8e2b-ba59ee1494ec\") " pod="openshift-network-diagnostics/network-check-target-m7vqg"
kubenswrapper[2581]: I0416 16:03:26.084236 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ktgq\" (UniqueName: \"kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq\") pod \"network-check-target-m7vqg\" (UID: \"313de001-22f6-48de-8e2b-ba59ee1494ec\") " pod="openshift-network-diagnostics/network-check-target-m7vqg" Apr 16 16:03:26.084479 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:26.084371 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:26.084479 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:26.084380 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:26.084479 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:26.084467 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs podName:0aa611e2-18d8-4712-9938-e8c21daeb1a0 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:58.084443736 +0000 UTC m=+66.261504785 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs") pod "network-metrics-daemon-rgfkx" (UID: "0aa611e2-18d8-4712-9938-e8c21daeb1a0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:26.084479 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:26.084471 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:26.084479 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:26.084484 2581 projected.go:194] Error preparing data for projected volume kube-api-access-6ktgq for pod openshift-network-diagnostics/network-check-target-m7vqg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:26.084719 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:26.084527 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq podName:313de001-22f6-48de-8e2b-ba59ee1494ec nodeName:}" failed. No retries permitted until 2026-04-16 16:03:58.084516836 +0000 UTC m=+66.261577869 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6ktgq" (UniqueName: "kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq") pod "network-check-target-m7vqg" (UID: "313de001-22f6-48de-8e2b-ba59ee1494ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:26.486972 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:26.486929 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls\") pod \"dns-default-t9bw7\" (UID: \"2f1e4962-e3c5-4ee3-952d-c5105193db44\") " pod="openshift-dns/dns-default-t9bw7" Apr 16 16:03:26.487533 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:26.487026 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert\") pod \"ingress-canary-dlmbb\" (UID: \"59a52d95-8872-4fa3-b620-5f948a8a6e16\") " pod="openshift-ingress-canary/ingress-canary-dlmbb" Apr 16 16:03:26.487533 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:26.487269 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:03:26.487651 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:26.487543 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls podName:2f1e4962-e3c5-4ee3-952d-c5105193db44 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:27.487508058 +0000 UTC m=+35.664569091 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls") pod "dns-default-t9bw7" (UID: "2f1e4962-e3c5-4ee3-952d-c5105193db44") : secret "dns-default-metrics-tls" not found Apr 16 16:03:26.487651 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:26.487540 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:26.487651 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:26.487593 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert podName:59a52d95-8872-4fa3-b620-5f948a8a6e16 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:27.487575909 +0000 UTC m=+35.664636942 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert") pod "ingress-canary-dlmbb" (UID: "59a52d95-8872-4fa3-b620-5f948a8a6e16") : secret "canary-serving-cert" not found Apr 16 16:03:27.372050 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:27.372014 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r8wpm" Apr 16 16:03:27.372239 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:27.372126 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m7vqg" Apr 16 16:03:27.372381 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:27.372356 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:03:27.374830 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:27.374811 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:03:27.374830 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:27.374826 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:03:27.375638 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:27.375621 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:03:27.375638 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:27.375598 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x7n6v\"" Apr 16 16:03:27.375764 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:27.375642 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 16:03:27.375764 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:27.375643 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vp75c\"" Apr 16 16:03:27.495937 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:27.495848 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert\") pod \"ingress-canary-dlmbb\" (UID: \"59a52d95-8872-4fa3-b620-5f948a8a6e16\") " pod="openshift-ingress-canary/ingress-canary-dlmbb" Apr 16 16:03:27.496309 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:27.495959 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls\") pod \"dns-default-t9bw7\" (UID: \"2f1e4962-e3c5-4ee3-952d-c5105193db44\") " pod="openshift-dns/dns-default-t9bw7" Apr 16 16:03:27.496309 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:27.496019 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:27.496309 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:27.496112 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert podName:59a52d95-8872-4fa3-b620-5f948a8a6e16 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:29.496091829 +0000 UTC m=+37.673152864 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert") pod "ingress-canary-dlmbb" (UID: "59a52d95-8872-4fa3-b620-5f948a8a6e16") : secret "canary-serving-cert" not found Apr 16 16:03:27.496309 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:27.496120 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:03:27.496309 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:27.496182 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls podName:2f1e4962-e3c5-4ee3-952d-c5105193db44 nodeName:}" failed. 
Apr 16 16:03:27.514240 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:27.514211 2581 generic.go:358] "Generic (PLEG): container finished" podID="fe31cde4-f24b-44d8-9e19-ba426c58b544" containerID="fa76f646473a335e332911a577fbe0293bc3ba306634e01a185b9889eff0423a" exitCode=0
Apr 16 16:03:27.514412 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:27.514263 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ml8fq" event={"ID":"fe31cde4-f24b-44d8-9e19-ba426c58b544","Type":"ContainerDied","Data":"fa76f646473a335e332911a577fbe0293bc3ba306634e01a185b9889eff0423a"}
Apr 16 16:03:28.518901 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:28.518869 2581 generic.go:358] "Generic (PLEG): container finished" podID="fe31cde4-f24b-44d8-9e19-ba426c58b544" containerID="8fd1ff0c6eaa14e33a22a9ae8a1ccfeec3d6fa393378fd395604e65c35723ad6" exitCode=0
Apr 16 16:03:28.519314 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:28.518922 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ml8fq" event={"ID":"fe31cde4-f24b-44d8-9e19-ba426c58b544","Type":"ContainerDied","Data":"8fd1ff0c6eaa14e33a22a9ae8a1ccfeec3d6fa393378fd395604e65c35723ad6"}
Apr 16 16:03:29.510646 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:29.510608 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert\") pod \"ingress-canary-dlmbb\" (UID: \"59a52d95-8872-4fa3-b620-5f948a8a6e16\") " pod="openshift-ingress-canary/ingress-canary-dlmbb"
Apr 16 16:03:29.510821 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:29.510664 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls\") pod \"dns-default-t9bw7\" (UID: \"2f1e4962-e3c5-4ee3-952d-c5105193db44\") " pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:03:29.510821 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:29.510770 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:03:29.510907 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:29.510836 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert podName:59a52d95-8872-4fa3-b620-5f948a8a6e16 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:33.510820816 +0000 UTC m=+41.687881863 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert") pod "ingress-canary-dlmbb" (UID: "59a52d95-8872-4fa3-b620-5f948a8a6e16") : secret "canary-serving-cert" not found
Apr 16 16:03:29.510907 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:29.510776 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:03:29.510907 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:29.510907 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls podName:2f1e4962-e3c5-4ee3-952d-c5105193db44 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:33.51089444 +0000 UTC m=+41.687955473 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls") pod "dns-default-t9bw7" (UID: "2f1e4962-e3c5-4ee3-952d-c5105193db44") : secret "dns-default-metrics-tls" not found
Apr 16 16:03:29.523778 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:29.523739 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ml8fq" event={"ID":"fe31cde4-f24b-44d8-9e19-ba426c58b544","Type":"ContainerStarted","Data":"5f19755cbda5c2b18bcb023911c8bc609a5f641c27e6aa0f516ec091b839d33f"}
Apr 16 16:03:29.550806 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:29.550760 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ml8fq" podStartSLOduration=6.171779007 podStartE2EDuration="37.550746541s" podCreationTimestamp="2026-04-16 16:02:52 +0000 UTC" firstStartedPulling="2026-04-16 16:02:55.113499246 +0000 UTC m=+3.290560280" lastFinishedPulling="2026-04-16 16:03:26.492466781 +0000 UTC m=+34.669527814" observedRunningTime="2026-04-16 16:03:29.549493096 +0000 UTC m=+37.726554183" watchObservedRunningTime="2026-04-16 16:03:29.550746541 +0000 UTC m=+37.727807597"
Apr 16 16:03:33.238452 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:33.238407 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret\") pod \"global-pull-secret-syncer-r8wpm\" (UID: \"487d3578-03cc-4a7d-9a13-eb666eaf2cf2\") " pod="kube-system/global-pull-secret-syncer-r8wpm"
Apr 16 16:03:33.241846 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:33.241820 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/487d3578-03cc-4a7d-9a13-eb666eaf2cf2-original-pull-secret\") pod \"global-pull-secret-syncer-r8wpm\" (UID: \"487d3578-03cc-4a7d-9a13-eb666eaf2cf2\") " pod="kube-system/global-pull-secret-syncer-r8wpm"
Apr 16 16:03:33.381296 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:33.381257 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r8wpm"
Apr 16 16:03:33.540897 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:33.540862 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert\") pod \"ingress-canary-dlmbb\" (UID: \"59a52d95-8872-4fa3-b620-5f948a8a6e16\") " pod="openshift-ingress-canary/ingress-canary-dlmbb"
Apr 16 16:03:33.541031 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:33.540942 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:03:33.541031 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:33.540966 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls\") pod \"dns-default-t9bw7\" (UID: \"2f1e4962-e3c5-4ee3-952d-c5105193db44\") " pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:03:33.541031 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:33.541007 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert podName:59a52d95-8872-4fa3-b620-5f948a8a6e16 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:41.540987642 +0000 UTC m=+49.718048697 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert") pod "ingress-canary-dlmbb" (UID: "59a52d95-8872-4fa3-b620-5f948a8a6e16") : secret "canary-serving-cert" not found
Apr 16 16:03:33.541152 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:33.541069 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:03:33.541152 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:33.541110 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls podName:2f1e4962-e3c5-4ee3-952d-c5105193db44 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:41.541098525 +0000 UTC m=+49.718159573 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls") pod "dns-default-t9bw7" (UID: "2f1e4962-e3c5-4ee3-952d-c5105193db44") : secret "dns-default-metrics-tls" not found
Apr 16 16:03:33.640515 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:33.640477 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-r8wpm"]
Apr 16 16:03:33.644100 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:03:33.644071 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod487d3578_03cc_4a7d_9a13_eb666eaf2cf2.slice/crio-8d151669f34e1064c583b14724a7fd3f36beaa8390f72b1ec3c7fd4c9fbcaaf8 WatchSource:0}: Error finding container 8d151669f34e1064c583b14724a7fd3f36beaa8390f72b1ec3c7fd4c9fbcaaf8: Status 404 returned error can't find the container with id 8d151669f34e1064c583b14724a7fd3f36beaa8390f72b1ec3c7fd4c9fbcaaf8
Apr 16 16:03:34.534142 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:34.534098 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-r8wpm" event={"ID":"487d3578-03cc-4a7d-9a13-eb666eaf2cf2","Type":"ContainerStarted","Data":"8d151669f34e1064c583b14724a7fd3f36beaa8390f72b1ec3c7fd4c9fbcaaf8"}
Apr 16 16:03:37.541702 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:37.541672 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-r8wpm" event={"ID":"487d3578-03cc-4a7d-9a13-eb666eaf2cf2","Type":"ContainerStarted","Data":"902331b970029d33249213adf2843777c4548bb73def52b91008c4e8ed99b488"}
Apr 16 16:03:38.563909 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:38.563855 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-r8wpm" podStartSLOduration=33.79867867 podStartE2EDuration="37.563840116s" podCreationTimestamp="2026-04-16 16:03:01 +0000 UTC" firstStartedPulling="2026-04-16 16:03:33.645764988 +0000 UTC m=+41.822826020" lastFinishedPulling="2026-04-16 16:03:37.410926434 +0000 UTC m=+45.587987466" observedRunningTime="2026-04-16 16:03:38.562563961 +0000 UTC m=+46.739625016" watchObservedRunningTime="2026-04-16 16:03:38.563840116 +0000 UTC m=+46.740901170"
Apr 16 16:03:41.601982 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:41.601945 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls\") pod \"dns-default-t9bw7\" (UID: \"2f1e4962-e3c5-4ee3-952d-c5105193db44\") " pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:03:41.602438 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:41.602013 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert\") pod \"ingress-canary-dlmbb\" (UID: \"59a52d95-8872-4fa3-b620-5f948a8a6e16\") " pod="openshift-ingress-canary/ingress-canary-dlmbb"
Apr 16 16:03:41.602438 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:41.602101 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:03:41.602438 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:41.602119 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:03:41.602438 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:41.602160 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert podName:59a52d95-8872-4fa3-b620-5f948a8a6e16 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:57.60214692 +0000 UTC m=+65.779207953 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert") pod "ingress-canary-dlmbb" (UID: "59a52d95-8872-4fa3-b620-5f948a8a6e16") : secret "canary-serving-cert" not found
Apr 16 16:03:41.602438 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:41.602196 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls podName:2f1e4962-e3c5-4ee3-952d-c5105193db44 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:57.602176341 +0000 UTC m=+65.779237373 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls") pod "dns-default-t9bw7" (UID: "2f1e4962-e3c5-4ee3-952d-c5105193db44") : secret "dns-default-metrics-tls" not found
Apr 16 16:03:49.507393 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:49.507358 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rdqjm"
Apr 16 16:03:53.984049 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:53.984015 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f899cc7b5-ts5r8"]
Apr 16 16:03:53.986801 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:53.986782 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f899cc7b5-ts5r8"
Apr 16 16:03:53.989588 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:53.989555 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 16:03:53.989588 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:53.989557 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 16 16:03:53.989783 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:53.989593 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 16:03:53.989783 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:53.989616 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 16:03:53.999941 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:53.998322 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f899cc7b5-ts5r8"]
Apr 16 16:03:54.019288 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.019263 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"]
Apr 16 16:03:54.021322 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.021303 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"
Apr 16 16:03:54.024287 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.024267 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 16 16:03:54.024958 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.024941 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 16 16:03:54.025098 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.025081 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 16 16:03:54.025190 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.025175 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 16 16:03:54.039019 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.038997 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"]
Apr 16 16:03:54.089261 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.089231 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fb7a564-3e72-44f6-876e-70c03d7b0004-tmp\") pod \"klusterlet-addon-workmgr-5f899cc7b5-ts5r8\" (UID: \"9fb7a564-3e72-44f6-876e-70c03d7b0004\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f899cc7b5-ts5r8"
Apr 16 16:03:54.089411 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.089272 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c30b47dc-abdc-404e-b585-3f77ecec1e5e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-548df49cd9-td9t5\" (UID: \"c30b47dc-abdc-404e-b585-3f77ecec1e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"
Apr 16 16:03:54.089411 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.089308 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9fb7a564-3e72-44f6-876e-70c03d7b0004-klusterlet-config\") pod \"klusterlet-addon-workmgr-5f899cc7b5-ts5r8\" (UID: \"9fb7a564-3e72-44f6-876e-70c03d7b0004\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f899cc7b5-ts5r8"
Apr 16 16:03:54.089411 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.089349 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkq92\" (UniqueName: \"kubernetes.io/projected/9fb7a564-3e72-44f6-876e-70c03d7b0004-kube-api-access-gkq92\") pod \"klusterlet-addon-workmgr-5f899cc7b5-ts5r8\" (UID: \"9fb7a564-3e72-44f6-876e-70c03d7b0004\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f899cc7b5-ts5r8"
Apr 16 16:03:54.089411 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.089372 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c30b47dc-abdc-404e-b585-3f77ecec1e5e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-548df49cd9-td9t5\" (UID: \"c30b47dc-abdc-404e-b585-3f77ecec1e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"
Apr 16 16:03:54.089411 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.089407 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c30b47dc-abdc-404e-b585-3f77ecec1e5e-hub\") pod \"cluster-proxy-proxy-agent-548df49cd9-td9t5\" (UID: \"c30b47dc-abdc-404e-b585-3f77ecec1e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"
Apr 16 16:03:54.089592 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.089426 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77vl6\" (UniqueName: \"kubernetes.io/projected/c30b47dc-abdc-404e-b585-3f77ecec1e5e-kube-api-access-77vl6\") pod \"cluster-proxy-proxy-agent-548df49cd9-td9t5\" (UID: \"c30b47dc-abdc-404e-b585-3f77ecec1e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"
Apr 16 16:03:54.089592 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.089512 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c30b47dc-abdc-404e-b585-3f77ecec1e5e-ca\") pod \"cluster-proxy-proxy-agent-548df49cd9-td9t5\" (UID: \"c30b47dc-abdc-404e-b585-3f77ecec1e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"
Apr 16 16:03:54.089592 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.089572 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c30b47dc-abdc-404e-b585-3f77ecec1e5e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-548df49cd9-td9t5\" (UID: \"c30b47dc-abdc-404e-b585-3f77ecec1e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"
Apr 16 16:03:54.190873 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.190835 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c30b47dc-abdc-404e-b585-3f77ecec1e5e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-548df49cd9-td9t5\" (UID: \"c30b47dc-abdc-404e-b585-3f77ecec1e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"
Apr 16 16:03:54.190873 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.190876 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fb7a564-3e72-44f6-876e-70c03d7b0004-tmp\") pod \"klusterlet-addon-workmgr-5f899cc7b5-ts5r8\" (UID: \"9fb7a564-3e72-44f6-876e-70c03d7b0004\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f899cc7b5-ts5r8"
Apr 16 16:03:54.191080 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.190899 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c30b47dc-abdc-404e-b585-3f77ecec1e5e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-548df49cd9-td9t5\" (UID: \"c30b47dc-abdc-404e-b585-3f77ecec1e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"
Apr 16 16:03:54.191154 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.191127 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9fb7a564-3e72-44f6-876e-70c03d7b0004-klusterlet-config\") pod \"klusterlet-addon-workmgr-5f899cc7b5-ts5r8\" (UID: \"9fb7a564-3e72-44f6-876e-70c03d7b0004\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f899cc7b5-ts5r8"
Apr 16 16:03:54.191209 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.191185 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkq92\" (UniqueName: \"kubernetes.io/projected/9fb7a564-3e72-44f6-876e-70c03d7b0004-kube-api-access-gkq92\") pod \"klusterlet-addon-workmgr-5f899cc7b5-ts5r8\" (UID: \"9fb7a564-3e72-44f6-876e-70c03d7b0004\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f899cc7b5-ts5r8"
Apr 16 16:03:54.191266 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.191216 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c30b47dc-abdc-404e-b585-3f77ecec1e5e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-548df49cd9-td9t5\" (UID: \"c30b47dc-abdc-404e-b585-3f77ecec1e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"
Apr 16 16:03:54.191266 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.191239 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fb7a564-3e72-44f6-876e-70c03d7b0004-tmp\") pod \"klusterlet-addon-workmgr-5f899cc7b5-ts5r8\" (UID: \"9fb7a564-3e72-44f6-876e-70c03d7b0004\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f899cc7b5-ts5r8"
Apr 16 16:03:54.191419 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.191400 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c30b47dc-abdc-404e-b585-3f77ecec1e5e-hub\") pod \"cluster-proxy-proxy-agent-548df49cd9-td9t5\" (UID: \"c30b47dc-abdc-404e-b585-3f77ecec1e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"
Apr 16 16:03:54.191487 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.191433 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77vl6\" (UniqueName: \"kubernetes.io/projected/c30b47dc-abdc-404e-b585-3f77ecec1e5e-kube-api-access-77vl6\") pod \"cluster-proxy-proxy-agent-548df49cd9-td9t5\" (UID: \"c30b47dc-abdc-404e-b585-3f77ecec1e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"
Apr 16 16:03:54.191487 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.191469 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c30b47dc-abdc-404e-b585-3f77ecec1e5e-ca\") pod \"cluster-proxy-proxy-agent-548df49cd9-td9t5\" (UID: \"c30b47dc-abdc-404e-b585-3f77ecec1e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"
Apr 16 16:03:54.191675 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.191654 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c30b47dc-abdc-404e-b585-3f77ecec1e5e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-548df49cd9-td9t5\" (UID: \"c30b47dc-abdc-404e-b585-3f77ecec1e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"
Apr 16 16:03:54.193576 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.193543 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c30b47dc-abdc-404e-b585-3f77ecec1e5e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-548df49cd9-td9t5\" (UID: \"c30b47dc-abdc-404e-b585-3f77ecec1e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"
Apr 16 16:03:54.193576 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.193561 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c30b47dc-abdc-404e-b585-3f77ecec1e5e-ca\") pod \"cluster-proxy-proxy-agent-548df49cd9-td9t5\" (UID: \"c30b47dc-abdc-404e-b585-3f77ecec1e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"
Apr 16 16:03:54.193750 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.193686 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9fb7a564-3e72-44f6-876e-70c03d7b0004-klusterlet-config\") pod \"klusterlet-addon-workmgr-5f899cc7b5-ts5r8\" (UID: \"9fb7a564-3e72-44f6-876e-70c03d7b0004\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f899cc7b5-ts5r8"
Apr 16 16:03:54.194019 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.193998 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c30b47dc-abdc-404e-b585-3f77ecec1e5e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-548df49cd9-td9t5\" (UID: \"c30b47dc-abdc-404e-b585-3f77ecec1e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"
Apr 16 16:03:54.194192 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.194175 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c30b47dc-abdc-404e-b585-3f77ecec1e5e-hub\") pod \"cluster-proxy-proxy-agent-548df49cd9-td9t5\" (UID: \"c30b47dc-abdc-404e-b585-3f77ecec1e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"
Apr 16 16:03:54.199003 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.198977 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77vl6\" (UniqueName: \"kubernetes.io/projected/c30b47dc-abdc-404e-b585-3f77ecec1e5e-kube-api-access-77vl6\") pod \"cluster-proxy-proxy-agent-548df49cd9-td9t5\" (UID: \"c30b47dc-abdc-404e-b585-3f77ecec1e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"
Apr 16 16:03:54.199546 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.199525 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkq92\" (UniqueName: \"kubernetes.io/projected/9fb7a564-3e72-44f6-876e-70c03d7b0004-kube-api-access-gkq92\") pod \"klusterlet-addon-workmgr-5f899cc7b5-ts5r8\" (UID: \"9fb7a564-3e72-44f6-876e-70c03d7b0004\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f899cc7b5-ts5r8"
Apr 16 16:03:54.296768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.296739 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f899cc7b5-ts5r8"
Apr 16 16:03:54.347861 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.347826 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"
Apr 16 16:03:54.420853 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.420817 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f899cc7b5-ts5r8"]
Apr 16 16:03:54.425061 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:03:54.425029 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fb7a564_3e72_44f6_876e_70c03d7b0004.slice/crio-e2a3d165bc6252a3bfec2fff1495c973e3a513e607d5477d428476724c00b542 WatchSource:0}: Error finding container e2a3d165bc6252a3bfec2fff1495c973e3a513e607d5477d428476724c00b542: Status 404 returned error can't find the container with id e2a3d165bc6252a3bfec2fff1495c973e3a513e607d5477d428476724c00b542
Apr 16 16:03:54.469804 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.469771 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5"]
Apr 16 16:03:54.474593 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:03:54.474556 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc30b47dc_abdc_404e_b585_3f77ecec1e5e.slice/crio-499574c3ef5acddac79eb4ebd6a6933db2e588a42bcdc40d7f777afa3ea36ac8 WatchSource:0}: Error finding container 499574c3ef5acddac79eb4ebd6a6933db2e588a42bcdc40d7f777afa3ea36ac8: Status 404 returned error can't find the container with id 499574c3ef5acddac79eb4ebd6a6933db2e588a42bcdc40d7f777afa3ea36ac8
Apr 16 16:03:54.576530 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.576445 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f899cc7b5-ts5r8" event={"ID":"9fb7a564-3e72-44f6-876e-70c03d7b0004","Type":"ContainerStarted","Data":"e2a3d165bc6252a3bfec2fff1495c973e3a513e607d5477d428476724c00b542"}
Apr 16 16:03:54.577401 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:54.577381 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5" event={"ID":"c30b47dc-abdc-404e-b585-3f77ecec1e5e","Type":"ContainerStarted","Data":"499574c3ef5acddac79eb4ebd6a6933db2e588a42bcdc40d7f777afa3ea36ac8"}
Apr 16 16:03:57.620478 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:57.620446 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert\") pod \"ingress-canary-dlmbb\" (UID: \"59a52d95-8872-4fa3-b620-5f948a8a6e16\") " pod="openshift-ingress-canary/ingress-canary-dlmbb"
Apr 16 16:03:57.620917 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:57.620492 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls\") pod \"dns-default-t9bw7\" (UID: \"2f1e4962-e3c5-4ee3-952d-c5105193db44\") " pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:03:57.620917 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:57.620588 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:03:57.620917 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:57.620651 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert podName:59a52d95-8872-4fa3-b620-5f948a8a6e16 nodeName:}" failed. No retries permitted until 2026-04-16 16:04:29.620636919 +0000 UTC m=+97.797697952 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert") pod "ingress-canary-dlmbb" (UID: "59a52d95-8872-4fa3-b620-5f948a8a6e16") : secret "canary-serving-cert" not found
Apr 16 16:03:57.620917 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:57.620596 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:03:57.620917 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:57.620721 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls podName:2f1e4962-e3c5-4ee3-952d-c5105193db44 nodeName:}" failed. No retries permitted until 2026-04-16 16:04:29.62070847 +0000 UTC m=+97.797769513 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls") pod "dns-default-t9bw7" (UID: "2f1e4962-e3c5-4ee3-952d-c5105193db44") : secret "dns-default-metrics-tls" not found
Apr 16 16:03:58.124888 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:58.124848 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs\") pod \"network-metrics-daemon-rgfkx\" (UID: \"0aa611e2-18d8-4712-9938-e8c21daeb1a0\") " pod="openshift-multus/network-metrics-daemon-rgfkx"
Apr 16 16:03:58.124888 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:58.124894 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ktgq\" (UniqueName: \"kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq\") pod \"network-check-target-m7vqg\" (UID: \"313de001-22f6-48de-8e2b-ba59ee1494ec\") " pod="openshift-network-diagnostics/network-check-target-m7vqg"
Apr 16 16:03:58.127693 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:58.127666 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 16:03:58.127819 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:58.127719 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 16:03:58.136084 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:58.136063 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 16:03:58.136199 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:03:58.136132 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs podName:0aa611e2-18d8-4712-9938-e8c21daeb1a0 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:02.136116394 +0000 UTC m=+130.313177426 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs") pod "network-metrics-daemon-rgfkx" (UID: "0aa611e2-18d8-4712-9938-e8c21daeb1a0") : secret "metrics-daemon-secret" not found
Apr 16 16:03:58.138604 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:58.138581 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 16:03:58.149429 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:58.149408 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ktgq\" (UniqueName: \"kubernetes.io/projected/313de001-22f6-48de-8e2b-ba59ee1494ec-kube-api-access-6ktgq\") pod \"network-check-target-m7vqg\" (UID: \"313de001-22f6-48de-8e2b-ba59ee1494ec\") " pod="openshift-network-diagnostics/network-check-target-m7vqg"
Apr 16 16:03:58.290410 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:58.290369 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x7n6v\""
Apr 16 16:03:58.297597 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:58.297566 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m7vqg"
Apr 16 16:03:59.097790 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:59.097755 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-m7vqg"]
Apr 16 16:03:59.100933 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:03:59.100909 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod313de001_22f6_48de_8e2b_ba59ee1494ec.slice/crio-4220dfe730bb8f929bd5655012c4cb1def510c580b8343dde3426497789d18ca WatchSource:0}: Error finding container 4220dfe730bb8f929bd5655012c4cb1def510c580b8343dde3426497789d18ca: Status 404 returned error can't find the container with id 4220dfe730bb8f929bd5655012c4cb1def510c580b8343dde3426497789d18ca
Apr 16 16:03:59.588491 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:59.588453 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-m7vqg" event={"ID":"313de001-22f6-48de-8e2b-ba59ee1494ec","Type":"ContainerStarted","Data":"4220dfe730bb8f929bd5655012c4cb1def510c580b8343dde3426497789d18ca"}
Apr 16 16:03:59.589969 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:59.589929 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f899cc7b5-ts5r8" event={"ID":"9fb7a564-3e72-44f6-876e-70c03d7b0004","Type":"ContainerStarted","Data":"918f91d815e3e9b769843d74eabb615a0bb442cd18aff26a07de80fc5a30785d"}
Apr 16 16:03:59.590165 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:59.590141 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f899cc7b5-ts5r8"
Apr 16 16:03:59.591447 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:59.591423 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5" event={"ID":"c30b47dc-abdc-404e-b585-3f77ecec1e5e","Type":"ContainerStarted","Data":"a66ff59d8dd8afc0e0b300a1557b68533b4945029c955c8cca39e7803cacf843"}
Apr 16 16:03:59.591826 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:59.591809 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f899cc7b5-ts5r8"
Apr 16 16:03:59.610620 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:03:59.610566 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5f899cc7b5-ts5r8" podStartSLOduration=2.041559468 podStartE2EDuration="6.610527434s" podCreationTimestamp="2026-04-16 16:03:53 +0000 UTC" firstStartedPulling="2026-04-16 16:03:54.42711757 +0000 UTC m=+62.604178605" lastFinishedPulling="2026-04-16 16:03:58.996085523 +0000 UTC m=+67.173146571" observedRunningTime="2026-04-16 16:03:59.610140128 +0000 UTC m=+67.787201184" watchObservedRunningTime="2026-04-16 16:03:59.610527434 +0000 UTC m=+67.787588490"
Apr 16 16:04:02.601357 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:04:02.601301 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5" event={"ID":"c30b47dc-abdc-404e-b585-3f77ecec1e5e","Type":"ContainerStarted","Data":"db8dc57c871cb2fd47db044e0e285c038f5329822f5c0e0ccb510be6d7eddb00"}
Apr 16 16:04:02.601357 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:04:02.601357 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5" event={"ID":"c30b47dc-abdc-404e-b585-3f77ecec1e5e","Type":"ContainerStarted","Data":"a6c7b6d33f991f9653852a21f1669178d986335350d885b147e680d5bc818d11"}
Apr 16 16:04:02.602657 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:04:02.602630 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-m7vqg" event={"ID":"313de001-22f6-48de-8e2b-ba59ee1494ec","Type":"ContainerStarted","Data":"4f89b89b7445e4edf374a662cda1ba0213d47c179049c4af9aedfaec8713618e"}
Apr 16 16:04:02.602819 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:04:02.602804 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-m7vqg"
Apr 16 16:04:02.622244 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:04:02.622196 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5" podStartSLOduration=2.014420013 podStartE2EDuration="9.622180801s" podCreationTimestamp="2026-04-16 16:03:53 +0000 UTC" firstStartedPulling="2026-04-16 16:03:54.476269895 +0000 UTC m=+62.653330929" lastFinishedPulling="2026-04-16 16:04:02.08403067 +0000 UTC m=+70.261091717" observedRunningTime="2026-04-16 16:04:02.621098961 +0000 UTC m=+70.798160017" watchObservedRunningTime="2026-04-16 16:04:02.622180801 +0000 UTC m=+70.799241889"
Apr 16 16:04:02.636605 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:04:02.636557 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-m7vqg" podStartSLOduration=67.65143928 podStartE2EDuration="1m10.636544311s" podCreationTimestamp="2026-04-16 16:02:52 +0000 UTC" firstStartedPulling="2026-04-16 16:03:59.104492988 +0000 UTC m=+67.281554035" lastFinishedPulling="2026-04-16 16:04:02.089598028 +0000 UTC m=+70.266659066" observedRunningTime="2026-04-16 16:04:02.636240159 +0000 UTC m=+70.813301213" watchObservedRunningTime="2026-04-16 16:04:02.636544311 +0000 UTC m=+70.813605366"
Apr 16 16:04:29.656921 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:04:29.656845 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls\") pod \"dns-default-t9bw7\" (UID: \"2f1e4962-e3c5-4ee3-952d-c5105193db44\") " pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:04:29.657384 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:04:29.656979 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert\") pod \"ingress-canary-dlmbb\" (UID: \"59a52d95-8872-4fa3-b620-5f948a8a6e16\") " pod="openshift-ingress-canary/ingress-canary-dlmbb"
Apr 16 16:04:29.657384 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:04:29.656995 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:04:29.657384 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:04:29.657086 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls podName:2f1e4962-e3c5-4ee3-952d-c5105193db44 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:33.657063618 +0000 UTC m=+161.834124671 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls") pod "dns-default-t9bw7" (UID: "2f1e4962-e3c5-4ee3-952d-c5105193db44") : secret "dns-default-metrics-tls" not found
Apr 16 16:04:29.657384 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:04:29.657089 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:04:29.657384 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:04:29.657132 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert podName:59a52d95-8872-4fa3-b620-5f948a8a6e16 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:33.6571217 +0000 UTC m=+161.834182743 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert") pod "ingress-canary-dlmbb" (UID: "59a52d95-8872-4fa3-b620-5f948a8a6e16") : secret "canary-serving-cert" not found
Apr 16 16:04:33.607688 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:04:33.607662 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-m7vqg"
Apr 16 16:05:02.186767 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:02.186730 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs\") pod \"network-metrics-daemon-rgfkx\" (UID: \"0aa611e2-18d8-4712-9938-e8c21daeb1a0\") " pod="openshift-multus/network-metrics-daemon-rgfkx"
Apr 16 16:05:02.187250 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:05:02.186848 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 16:05:02.187250 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:05:02.186903 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs podName:0aa611e2-18d8-4712-9938-e8c21daeb1a0 nodeName:}" failed. No retries permitted until 2026-04-16 16:07:04.186888692 +0000 UTC m=+252.363949730 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs") pod "network-metrics-daemon-rgfkx" (UID: "0aa611e2-18d8-4712-9938-e8c21daeb1a0") : secret "metrics-daemon-secret" not found
Apr 16 16:05:25.251252 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:25.251224 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sjpbs_70dfab46-81b5-47d9-b69d-a3d94a7c5e13/dns-node-resolver/0.log"
Apr 16 16:05:26.052201 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:26.052175 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-t5mlw_d19613ed-0faf-481f-bc0d-b4f8fcf0f259/node-ca/0.log"
Apr 16 16:05:28.783531 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:05:28.783474 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-dlmbb" podUID="59a52d95-8872-4fa3-b620-5f948a8a6e16"
Apr 16 16:05:28.799641 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:05:28.799612 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-t9bw7" podUID="2f1e4962-e3c5-4ee3-952d-c5105193db44"
Apr 16 16:05:28.801218 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:28.801197 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dlmbb"
Apr 16 16:05:29.802986 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:29.802953 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:05:30.391555 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:05:30.391520 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-rgfkx" podUID="0aa611e2-18d8-4712-9938-e8c21daeb1a0"
Apr 16 16:05:33.700158 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:33.700114 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert\") pod \"ingress-canary-dlmbb\" (UID: \"59a52d95-8872-4fa3-b620-5f948a8a6e16\") " pod="openshift-ingress-canary/ingress-canary-dlmbb"
Apr 16 16:05:33.700158 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:33.700168 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls\") pod \"dns-default-t9bw7\" (UID: \"2f1e4962-e3c5-4ee3-952d-c5105193db44\") " pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:05:33.702505 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:33.702486 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f1e4962-e3c5-4ee3-952d-c5105193db44-metrics-tls\") pod \"dns-default-t9bw7\" (UID: \"2f1e4962-e3c5-4ee3-952d-c5105193db44\") " pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:05:33.702617 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:33.702598 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59a52d95-8872-4fa3-b620-5f948a8a6e16-cert\") pod \"ingress-canary-dlmbb\" (UID: \"59a52d95-8872-4fa3-b620-5f948a8a6e16\") " pod="openshift-ingress-canary/ingress-canary-dlmbb"
Apr 16 16:05:33.705926 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:33.705910 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-v9cqx\""
Apr 16 16:05:33.714154 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:33.714134 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:05:33.838145 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:33.838115 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t9bw7"]
Apr 16 16:05:33.841980 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:05:33.841935 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f1e4962_e3c5_4ee3_952d_c5105193db44.slice/crio-1623b6aa322a5b030928fefea27a37227f23d53df32e533cc67d721a0ac5307e WatchSource:0}: Error finding container 1623b6aa322a5b030928fefea27a37227f23d53df32e533cc67d721a0ac5307e: Status 404 returned error can't find the container with id 1623b6aa322a5b030928fefea27a37227f23d53df32e533cc67d721a0ac5307e
Apr 16 16:05:33.904570 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:33.904545 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wjzs4\""
Apr 16 16:05:33.912617 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:33.912600 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dlmbb"
Apr 16 16:05:34.032882 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:34.032851 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dlmbb"]
Apr 16 16:05:34.036157 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:05:34.036130 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59a52d95_8872_4fa3_b620_5f948a8a6e16.slice/crio-9610203e39c71e2a1719b0bddf581c4e7cbef4c36141167100c01c25867cf614 WatchSource:0}: Error finding container 9610203e39c71e2a1719b0bddf581c4e7cbef4c36141167100c01c25867cf614: Status 404 returned error can't find the container with id 9610203e39c71e2a1719b0bddf581c4e7cbef4c36141167100c01c25867cf614
Apr 16 16:05:34.817408 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:34.817325 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t9bw7" event={"ID":"2f1e4962-e3c5-4ee3-952d-c5105193db44","Type":"ContainerStarted","Data":"1623b6aa322a5b030928fefea27a37227f23d53df32e533cc67d721a0ac5307e"}
Apr 16 16:05:34.818525 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:34.818488 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dlmbb" event={"ID":"59a52d95-8872-4fa3-b620-5f948a8a6e16","Type":"ContainerStarted","Data":"9610203e39c71e2a1719b0bddf581c4e7cbef4c36141167100c01c25867cf614"}
Apr 16 16:05:36.825416 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:36.825377 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dlmbb" event={"ID":"59a52d95-8872-4fa3-b620-5f948a8a6e16","Type":"ContainerStarted","Data":"90f2879df1cac9f1df513821a0fb35156ffa0235724b838133c09d77680e3cb5"}
Apr 16 16:05:36.826794 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:36.826771 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t9bw7" event={"ID":"2f1e4962-e3c5-4ee3-952d-c5105193db44","Type":"ContainerStarted","Data":"de0e007a5a0b7cc858431b029aaca325330c89edad952e92afe587da891704e9"}
Apr 16 16:05:36.826914 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:36.826799 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t9bw7" event={"ID":"2f1e4962-e3c5-4ee3-952d-c5105193db44","Type":"ContainerStarted","Data":"83755bb69dc497fca105cbd3a87f4c4c9337d369862d03f8c9e925a817b924fe"}
Apr 16 16:05:36.826914 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:36.826893 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-t9bw7"
Apr 16 16:05:36.843930 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:36.843878 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dlmbb" podStartSLOduration=130.118656053 podStartE2EDuration="2m11.843863648s" podCreationTimestamp="2026-04-16 16:03:25 +0000 UTC" firstStartedPulling="2026-04-16 16:05:34.038072831 +0000 UTC m=+162.215133865" lastFinishedPulling="2026-04-16 16:05:35.763280413 +0000 UTC m=+163.940341460" observedRunningTime="2026-04-16 16:05:36.842905644 +0000 UTC m=+165.019966701" watchObservedRunningTime="2026-04-16 16:05:36.843863648 +0000 UTC m=+165.020924697"
Apr 16 16:05:36.861761 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:36.861715 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-t9bw7"
podStartSLOduration=129.968993898 podStartE2EDuration="2m11.861702033s" podCreationTimestamp="2026-04-16 16:03:25 +0000 UTC" firstStartedPulling="2026-04-16 16:05:33.843718009 +0000 UTC m=+162.020779045" lastFinishedPulling="2026-04-16 16:05:35.736426134 +0000 UTC m=+163.913487180" observedRunningTime="2026-04-16 16:05:36.861166068 +0000 UTC m=+165.038227123" watchObservedRunningTime="2026-04-16 16:05:36.861702033 +0000 UTC m=+165.038763087" Apr 16 16:05:43.372179 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:43.372079 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:05:46.831643 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:46.831613 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-t9bw7" Apr 16 16:05:48.586190 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.586159 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-5gnbb"] Apr 16 16:05:48.588381 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.588360 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5gnbb" Apr 16 16:05:48.591211 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.591182 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 16:05:48.591388 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.591363 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 16:05:48.592127 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.592109 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 16:05:48.592222 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.592137 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 16:05:48.592369 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.592230 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-p4mnm\"" Apr 16 16:05:48.622028 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.621997 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5gnbb"] Apr 16 16:05:48.713576 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.713543 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/29fd374c-3a57-4945-9494-014bf8f71730-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5gnbb\" (UID: \"29fd374c-3a57-4945-9494-014bf8f71730\") " pod="openshift-insights/insights-runtime-extractor-5gnbb" Apr 16 16:05:48.713726 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.713592 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5l6f\" (UniqueName: \"kubernetes.io/projected/29fd374c-3a57-4945-9494-014bf8f71730-kube-api-access-h5l6f\") pod \"insights-runtime-extractor-5gnbb\" (UID: \"29fd374c-3a57-4945-9494-014bf8f71730\") " pod="openshift-insights/insights-runtime-extractor-5gnbb" Apr 16 16:05:48.713726 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.713610 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/29fd374c-3a57-4945-9494-014bf8f71730-data-volume\") pod \"insights-runtime-extractor-5gnbb\" (UID: \"29fd374c-3a57-4945-9494-014bf8f71730\") " pod="openshift-insights/insights-runtime-extractor-5gnbb" Apr 16 16:05:48.713726 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.713627 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/29fd374c-3a57-4945-9494-014bf8f71730-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5gnbb\" (UID: \"29fd374c-3a57-4945-9494-014bf8f71730\") " pod="openshift-insights/insights-runtime-extractor-5gnbb" Apr 16 16:05:48.713726 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.713712 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/29fd374c-3a57-4945-9494-014bf8f71730-crio-socket\") pod \"insights-runtime-extractor-5gnbb\" (UID: \"29fd374c-3a57-4945-9494-014bf8f71730\") " pod="openshift-insights/insights-runtime-extractor-5gnbb" Apr 16 16:05:48.814223 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.814179 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/29fd374c-3a57-4945-9494-014bf8f71730-crio-socket\") pod \"insights-runtime-extractor-5gnbb\" (UID: \"29fd374c-3a57-4945-9494-014bf8f71730\") " pod="openshift-insights/insights-runtime-extractor-5gnbb" Apr 16 16:05:48.814223 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.814228 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/29fd374c-3a57-4945-9494-014bf8f71730-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5gnbb\" (UID: \"29fd374c-3a57-4945-9494-014bf8f71730\") " pod="openshift-insights/insights-runtime-extractor-5gnbb" Apr 16 16:05:48.814451 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.814257 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5l6f\" (UniqueName: \"kubernetes.io/projected/29fd374c-3a57-4945-9494-014bf8f71730-kube-api-access-h5l6f\") pod \"insights-runtime-extractor-5gnbb\" (UID: \"29fd374c-3a57-4945-9494-014bf8f71730\") " pod="openshift-insights/insights-runtime-extractor-5gnbb" Apr 16 16:05:48.814451 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.814277 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/29fd374c-3a57-4945-9494-014bf8f71730-data-volume\") pod \"insights-runtime-extractor-5gnbb\" (UID: \"29fd374c-3a57-4945-9494-014bf8f71730\") " pod="openshift-insights/insights-runtime-extractor-5gnbb" Apr 16 16:05:48.814451 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.814292 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/29fd374c-3a57-4945-9494-014bf8f71730-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5gnbb\" (UID: \"29fd374c-3a57-4945-9494-014bf8f71730\") " pod="openshift-insights/insights-runtime-extractor-5gnbb" Apr 16 16:05:48.814451 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.814292 2581 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/29fd374c-3a57-4945-9494-014bf8f71730-crio-socket\") pod \"insights-runtime-extractor-5gnbb\" (UID: \"29fd374c-3a57-4945-9494-014bf8f71730\") " pod="openshift-insights/insights-runtime-extractor-5gnbb" Apr 16 16:05:48.814709 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.814689 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/29fd374c-3a57-4945-9494-014bf8f71730-data-volume\") pod \"insights-runtime-extractor-5gnbb\" (UID: \"29fd374c-3a57-4945-9494-014bf8f71730\") " pod="openshift-insights/insights-runtime-extractor-5gnbb" Apr 16 16:05:48.814797 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.814773 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/29fd374c-3a57-4945-9494-014bf8f71730-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5gnbb\" (UID: \"29fd374c-3a57-4945-9494-014bf8f71730\") " pod="openshift-insights/insights-runtime-extractor-5gnbb" Apr 16 16:05:48.816626 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.816605 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/29fd374c-3a57-4945-9494-014bf8f71730-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5gnbb\" (UID: \"29fd374c-3a57-4945-9494-014bf8f71730\") " pod="openshift-insights/insights-runtime-extractor-5gnbb" Apr 16 16:05:48.826907 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.826886 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5l6f\" (UniqueName: \"kubernetes.io/projected/29fd374c-3a57-4945-9494-014bf8f71730-kube-api-access-h5l6f\") pod \"insights-runtime-extractor-5gnbb\" (UID: \"29fd374c-3a57-4945-9494-014bf8f71730\") " pod="openshift-insights/insights-runtime-extractor-5gnbb" Apr 16 16:05:48.898906 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:48.898825 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5gnbb" Apr 16 16:05:49.020437 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:49.020406 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5gnbb"] Apr 16 16:05:49.023470 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:05:49.023440 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29fd374c_3a57_4945_9494_014bf8f71730.slice/crio-bdffe0f084f67e9fab0d8d5ae32263d1046e23bcde25db01c13b86a4ff23e81a WatchSource:0}: Error finding container bdffe0f084f67e9fab0d8d5ae32263d1046e23bcde25db01c13b86a4ff23e81a: Status 404 returned error can't find the container with id bdffe0f084f67e9fab0d8d5ae32263d1046e23bcde25db01c13b86a4ff23e81a Apr 16 16:05:49.861719 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:49.861684 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5gnbb" event={"ID":"29fd374c-3a57-4945-9494-014bf8f71730","Type":"ContainerStarted","Data":"cca371fee31bf094264fa096f1c3d600545c9280e9d18701eab1d9452a5797bd"} Apr 16 16:05:49.861719 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:49.861720 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5gnbb" event={"ID":"29fd374c-3a57-4945-9494-014bf8f71730","Type":"ContainerStarted","Data":"7d330060145b4d5e01ec4868a569a2756618e982064916c3d30accdf78a4eb16"} Apr 16 16:05:49.862387 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:49.861729 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5gnbb" event={"ID":"29fd374c-3a57-4945-9494-014bf8f71730","Type":"ContainerStarted","Data":"bdffe0f084f67e9fab0d8d5ae32263d1046e23bcde25db01c13b86a4ff23e81a"} Apr 16 16:05:51.868114 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:51.868074 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5gnbb" event={"ID":"29fd374c-3a57-4945-9494-014bf8f71730","Type":"ContainerStarted","Data":"bbb3bab181de55c106a73b3dfc2fdb01e06756e12f1666b4cbb813c5e422e780"} Apr 16 16:05:51.890209 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:51.890161 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-5gnbb" podStartSLOduration=2.062479424 podStartE2EDuration="3.890148426s" podCreationTimestamp="2026-04-16 16:05:48 +0000 UTC" firstStartedPulling="2026-04-16 16:05:49.072251278 +0000 UTC m=+177.249312312" lastFinishedPulling="2026-04-16 16:05:50.89992028 +0000 UTC m=+179.076981314" observedRunningTime="2026-04-16 16:05:51.889795616 +0000 UTC m=+180.066856670" watchObservedRunningTime="2026-04-16 16:05:51.890148426 +0000 UTC m=+180.067209481" Apr 16 16:05:52.977043 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:52.977007 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-d27hf"] Apr 16 16:05:52.979044 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:52.979028 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-d27hf" Apr 16 16:05:52.982253 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:52.982224 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 16:05:52.982392 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:52.982293 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-vvx97\"" Apr 16 16:05:52.991537 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:52.991516 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-d27hf"] Apr 16 16:05:53.048301 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:53.048269 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e2452edd-6580-48d0-aa50-8d8eda8abc6f-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-d27hf\" (UID: \"e2452edd-6580-48d0-aa50-8d8eda8abc6f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-d27hf" Apr 16 16:05:53.149596 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:53.149556 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e2452edd-6580-48d0-aa50-8d8eda8abc6f-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-d27hf\" (UID: \"e2452edd-6580-48d0-aa50-8d8eda8abc6f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-d27hf" Apr 16 16:05:53.151809 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:53.151789 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e2452edd-6580-48d0-aa50-8d8eda8abc6f-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-d27hf\" (UID: \"e2452edd-6580-48d0-aa50-8d8eda8abc6f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-d27hf" Apr 16 16:05:53.288214 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:53.288106 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-d27hf" Apr 16 16:05:53.405046 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:53.405012 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-d27hf"] Apr 16 16:05:53.408794 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:05:53.408769 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2452edd_6580_48d0_aa50_8d8eda8abc6f.slice/crio-d91dac384fd24eea359f4517caa703e8ec4d73693e3ba3efc9429d624e8f443f WatchSource:0}: Error finding container d91dac384fd24eea359f4517caa703e8ec4d73693e3ba3efc9429d624e8f443f: Status 404 returned error can't find the container with id d91dac384fd24eea359f4517caa703e8ec4d73693e3ba3efc9429d624e8f443f Apr 16 16:05:53.873570 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:53.873536 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-d27hf" event={"ID":"e2452edd-6580-48d0-aa50-8d8eda8abc6f","Type":"ContainerStarted","Data":"d91dac384fd24eea359f4517caa703e8ec4d73693e3ba3efc9429d624e8f443f"} Apr 16 16:05:54.877311 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:54.877270 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-d27hf" event={"ID":"e2452edd-6580-48d0-aa50-8d8eda8abc6f","Type":"ContainerStarted","Data":"2880cfebfaa5bba733095908e97790c747a7169a1c51a24689a04145726b68fd"} Apr 16 16:05:54.877774 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:54.877469 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-d27hf" Apr 16 16:05:54.881976 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:54.881952 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-d27hf" Apr 16 16:05:54.893998 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:05:54.893954 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-d27hf" podStartSLOduration=1.865867056 podStartE2EDuration="2.893943398s" podCreationTimestamp="2026-04-16 16:05:52 +0000 UTC" firstStartedPulling="2026-04-16 16:05:53.41082756 +0000 UTC m=+181.587888604" lastFinishedPulling="2026-04-16 16:05:54.438903913 +0000 UTC m=+182.615964946" observedRunningTime="2026-04-16 16:05:54.892560201 +0000 UTC m=+183.069621266" watchObservedRunningTime="2026-04-16 16:05:54.893943398 +0000 UTC m=+183.071004452" Apr 16 16:06:01.456573 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.456541 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt"] Apr 16 16:06:01.458830 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.458810 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt" Apr 16 16:06:01.462188 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.462162 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 16:06:01.462188 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.462162 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-t8tmq\"" Apr 16 16:06:01.462433 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.462163 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 16:06:01.462433 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.462163 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 16:06:01.462433 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.462173 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 16:06:01.462433 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.462175 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 16:06:01.469323 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.469302 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9wfjm"] Apr 16 16:06:01.471376 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.471362 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.476275 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.476255 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 16:06:01.476582 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.476270 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 16:06:01.476820 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.476789 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 16:06:01.477036 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.476994 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-87tn6\"" Apr 16 16:06:01.482470 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.482444 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt"] Apr 16 16:06:01.611308 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.611276 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a6e16886-bf74-4956-bc2c-ec8432af1f06-node-exporter-textfile\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.611308 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.611313 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/792a7b05-8abe-4340-8ed9-d8a1811f25f6-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-4f9bt\" (UID: \"792a7b05-8abe-4340-8ed9-d8a1811f25f6\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt" Apr 16 16:06:01.611528 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.611351 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6e16886-bf74-4956-bc2c-ec8432af1f06-metrics-client-ca\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.611528 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.611423 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a6e16886-bf74-4956-bc2c-ec8432af1f06-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.611528 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.611488 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a6e16886-bf74-4956-bc2c-ec8432af1f06-node-exporter-accelerators-collector-config\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.611679 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.611544 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a6e16886-bf74-4956-bc2c-ec8432af1f06-sys\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.611679 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.611567 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/792a7b05-8abe-4340-8ed9-d8a1811f25f6-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-4f9bt\" (UID: \"792a7b05-8abe-4340-8ed9-d8a1811f25f6\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt" Apr 16 16:06:01.611679 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.611588 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a6e16886-bf74-4956-bc2c-ec8432af1f06-root\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.611679 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.611648 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a6e16886-bf74-4956-bc2c-ec8432af1f06-node-exporter-wtmp\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.611809 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.611691 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/792a7b05-8abe-4340-8ed9-d8a1811f25f6-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-4f9bt\" (UID: \"792a7b05-8abe-4340-8ed9-d8a1811f25f6\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt" Apr 16 16:06:01.611809 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.611709 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97n6j\" (UniqueName: \"kubernetes.io/projected/a6e16886-bf74-4956-bc2c-ec8432af1f06-kube-api-access-97n6j\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.611809 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.611726 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxr6z\" (UniqueName: \"kubernetes.io/projected/792a7b05-8abe-4340-8ed9-d8a1811f25f6-kube-api-access-cxr6z\") pod \"openshift-state-metrics-5669946b84-4f9bt\" (UID: \"792a7b05-8abe-4340-8ed9-d8a1811f25f6\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt" Apr 16 16:06:01.611809 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.611743 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a6e16886-bf74-4956-bc2c-ec8432af1f06-node-exporter-tls\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.712231 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.712133 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a6e16886-bf74-4956-bc2c-ec8432af1f06-root\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.712365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.712206 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a6e16886-bf74-4956-bc2c-ec8432af1f06-node-exporter-wtmp\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.712365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.712286 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/792a7b05-8abe-4340-8ed9-d8a1811f25f6-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-4f9bt\" (UID: \"792a7b05-8abe-4340-8ed9-d8a1811f25f6\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt" Apr 16 16:06:01.712365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.712315 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97n6j\" (UniqueName: \"kubernetes.io/projected/a6e16886-bf74-4956-bc2c-ec8432af1f06-kube-api-access-97n6j\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.712365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.712243 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/a6e16886-bf74-4956-bc2c-ec8432af1f06-root\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.712365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.712356 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a6e16886-bf74-4956-bc2c-ec8432af1f06-node-exporter-wtmp\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.712576 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.712367 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxr6z\" (UniqueName: \"kubernetes.io/projected/792a7b05-8abe-4340-8ed9-d8a1811f25f6-kube-api-access-cxr6z\") pod \"openshift-state-metrics-5669946b84-4f9bt\" (UID: \"792a7b05-8abe-4340-8ed9-d8a1811f25f6\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt" Apr 16 16:06:01.712576 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.712404 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a6e16886-bf74-4956-bc2c-ec8432af1f06-node-exporter-tls\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.712576 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.712445 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a6e16886-bf74-4956-bc2c-ec8432af1f06-node-exporter-textfile\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.712576 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.712470 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/792a7b05-8abe-4340-8ed9-d8a1811f25f6-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-4f9bt\" (UID: \"792a7b05-8abe-4340-8ed9-d8a1811f25f6\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt" Apr 16 16:06:01.712576 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.712496 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6e16886-bf74-4956-bc2c-ec8432af1f06-metrics-client-ca\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.712576 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.712538 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a6e16886-bf74-4956-bc2c-ec8432af1f06-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.712576 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:06:01.712544 2581 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 16:06:01.712576 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.712564 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a6e16886-bf74-4956-bc2c-ec8432af1f06-node-exporter-accelerators-collector-config\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.712957 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.712599 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a6e16886-bf74-4956-bc2c-ec8432af1f06-sys\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.712957 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:06:01.712632 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6e16886-bf74-4956-bc2c-ec8432af1f06-node-exporter-tls podName:a6e16886-bf74-4956-bc2c-ec8432af1f06 nodeName:}" failed. No retries permitted until 2026-04-16 16:06:02.212609319 +0000 UTC m=+190.389670367 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/a6e16886-bf74-4956-bc2c-ec8432af1f06-node-exporter-tls") pod "node-exporter-9wfjm" (UID: "a6e16886-bf74-4956-bc2c-ec8432af1f06") : secret "node-exporter-tls" not found Apr 16 16:06:01.712957 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.712638 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a6e16886-bf74-4956-bc2c-ec8432af1f06-sys\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.712957 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.712686 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/792a7b05-8abe-4340-8ed9-d8a1811f25f6-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-4f9bt\" (UID: \"792a7b05-8abe-4340-8ed9-d8a1811f25f6\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt" Apr 16 16:06:01.713171 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.713128 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a6e16886-bf74-4956-bc2c-ec8432af1f06-node-exporter-textfile\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.713227 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.713189 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6e16886-bf74-4956-bc2c-ec8432af1f06-metrics-client-ca\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.713308 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.713285 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a6e16886-bf74-4956-bc2c-ec8432af1f06-node-exporter-accelerators-collector-config\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.713444 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.713424 
2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/792a7b05-8abe-4340-8ed9-d8a1811f25f6-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-4f9bt\" (UID: \"792a7b05-8abe-4340-8ed9-d8a1811f25f6\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt" Apr 16 16:06:01.714877 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.714858 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a6e16886-bf74-4956-bc2c-ec8432af1f06-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.715272 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.715250 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/792a7b05-8abe-4340-8ed9-d8a1811f25f6-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-4f9bt\" (UID: \"792a7b05-8abe-4340-8ed9-d8a1811f25f6\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt" Apr 16 16:06:01.715377 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.715355 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/792a7b05-8abe-4340-8ed9-d8a1811f25f6-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-4f9bt\" (UID: \"792a7b05-8abe-4340-8ed9-d8a1811f25f6\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt" Apr 16 16:06:01.725157 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.725122 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxr6z\" (UniqueName: \"kubernetes.io/projected/792a7b05-8abe-4340-8ed9-d8a1811f25f6-kube-api-access-cxr6z\") pod \"openshift-state-metrics-5669946b84-4f9bt\" (UID: \"792a7b05-8abe-4340-8ed9-d8a1811f25f6\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt" Apr 16 16:06:01.726708 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.726687 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97n6j\" (UniqueName: \"kubernetes.io/projected/a6e16886-bf74-4956-bc2c-ec8432af1f06-kube-api-access-97n6j\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:01.767906 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.767873 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt" Apr 16 16:06:01.924261 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:01.924224 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt"] Apr 16 16:06:01.928736 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:06:01.928703 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod792a7b05_8abe_4340_8ed9_d8a1811f25f6.slice/crio-89612615e5eb0a6cdf0e02a4f6dac81bdefa938222b08b3ec8021f7282bc708c WatchSource:0}: Error finding container 89612615e5eb0a6cdf0e02a4f6dac81bdefa938222b08b3ec8021f7282bc708c: Status 404 returned error can't find the container with id 89612615e5eb0a6cdf0e02a4f6dac81bdefa938222b08b3ec8021f7282bc708c Apr 16 16:06:02.217916 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:02.217878 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a6e16886-bf74-4956-bc2c-ec8432af1f06-node-exporter-tls\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:02.220127 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:02.220081 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a6e16886-bf74-4956-bc2c-ec8432af1f06-node-exporter-tls\") pod \"node-exporter-9wfjm\" (UID: \"a6e16886-bf74-4956-bc2c-ec8432af1f06\") " pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:02.380444 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:02.380418 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9wfjm" Apr 16 16:06:02.388434 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:06:02.388413 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6e16886_bf74_4956_bc2c_ec8432af1f06.slice/crio-bb94bb0ad7c25981f8fe19d91b67dfc90622b3744a6a84afdaf2880a060c76b0 WatchSource:0}: Error finding container bb94bb0ad7c25981f8fe19d91b67dfc90622b3744a6a84afdaf2880a060c76b0: Status 404 returned error can't find the container with id bb94bb0ad7c25981f8fe19d91b67dfc90622b3744a6a84afdaf2880a060c76b0 Apr 16 16:06:02.898418 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:02.898372 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9wfjm" event={"ID":"a6e16886-bf74-4956-bc2c-ec8432af1f06","Type":"ContainerStarted","Data":"bb94bb0ad7c25981f8fe19d91b67dfc90622b3744a6a84afdaf2880a060c76b0"} Apr 16 16:06:02.900309 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:02.900281 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt" event={"ID":"792a7b05-8abe-4340-8ed9-d8a1811f25f6","Type":"ContainerStarted","Data":"1e101392120cf6347c4ad6f9ab88086c92eafb33bbb6003e02dcbdf63efa5f1e"} Apr 16 16:06:02.900452 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:02.900314 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt" event={"ID":"792a7b05-8abe-4340-8ed9-d8a1811f25f6","Type":"ContainerStarted","Data":"ab9067bc1c5a0ae9464810c8060d6786031af354526db4c8cebc339ce5a16d92"} Apr 16 16:06:02.900452 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:02.900326 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt" event={"ID":"792a7b05-8abe-4340-8ed9-d8a1811f25f6","Type":"ContainerStarted","Data":"89612615e5eb0a6cdf0e02a4f6dac81bdefa938222b08b3ec8021f7282bc708c"} Apr 16 16:06:03.904445 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:03.904413 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt" event={"ID":"792a7b05-8abe-4340-8ed9-d8a1811f25f6","Type":"ContainerStarted","Data":"6f428033fd4e1fda2060ad28f4dd3fc7e274ac02f081406a37937d59c025200a"} Apr 16 16:06:03.905766 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:03.905739 2581 generic.go:358] "Generic (PLEG): container finished" podID="a6e16886-bf74-4956-bc2c-ec8432af1f06" containerID="eea9e54af6122061fba10b0e25b8fbd10bd77709ded472a128520cf3e0c801a7" exitCode=0 Apr 16 16:06:03.905887 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:03.905789 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9wfjm" event={"ID":"a6e16886-bf74-4956-bc2c-ec8432af1f06","Type":"ContainerDied","Data":"eea9e54af6122061fba10b0e25b8fbd10bd77709ded472a128520cf3e0c801a7"} Apr 16 16:06:03.925422 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:03.925379 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4f9bt" podStartSLOduration=1.8596579050000002 podStartE2EDuration="2.925359425s" podCreationTimestamp="2026-04-16 16:06:01 +0000 UTC" firstStartedPulling="2026-04-16 16:06:02.026799308 +0000 UTC m=+190.203860340" lastFinishedPulling="2026-04-16 16:06:03.092500827 +0000 UTC m=+191.269561860" observedRunningTime="2026-04-16 
16:06:03.924382202 +0000 UTC m=+192.101443258" watchObservedRunningTime="2026-04-16 16:06:03.925359425 +0000 UTC m=+192.102420479" Apr 16 16:06:04.910689 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:04.910644 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9wfjm" event={"ID":"a6e16886-bf74-4956-bc2c-ec8432af1f06","Type":"ContainerStarted","Data":"21065fae2edeb0aef0f6e3ba0f0baa7e0dcd3808202e1d57dd2ce4428e1f0b92"} Apr 16 16:06:04.910689 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:04.910694 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9wfjm" event={"ID":"a6e16886-bf74-4956-bc2c-ec8432af1f06","Type":"ContainerStarted","Data":"e25f8bfd543caf41dd2c2bd332cf8c5f455dcda84fd1ddf5b62aff6278997143"} Apr 16 16:06:04.934582 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:04.934533 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9wfjm" podStartSLOduration=3.2308200830000002 podStartE2EDuration="3.934517951s" podCreationTimestamp="2026-04-16 16:06:01 +0000 UTC" firstStartedPulling="2026-04-16 16:06:02.390169822 +0000 UTC m=+190.567230869" lastFinishedPulling="2026-04-16 16:06:03.093867693 +0000 UTC m=+191.270928737" observedRunningTime="2026-04-16 16:06:04.933220731 +0000 UTC m=+193.110281798" watchObservedRunningTime="2026-04-16 16:06:04.934517951 +0000 UTC m=+193.111579006" Apr 16 16:06:05.906163 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:05.906129 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-857bc6d45b-7q7v8"] Apr 16 16:06:05.908139 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:05.908122 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:05.912928 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:05.912907 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-bre3mn94ne06e\"" Apr 16 16:06:05.913262 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:05.912912 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-blrbl\"" Apr 16 16:06:05.913262 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:05.912972 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 16:06:05.919737 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:05.919712 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 16:06:05.919897 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:05.919760 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 16:06:05.919897 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:05.919785 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 16:06:05.935418 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:05.935387 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-857bc6d45b-7q7v8"] Apr 16 16:06:06.050732 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.050698 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.050732 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.050731 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-client-ca-bundle\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.050943 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.050760 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-secret-metrics-server-tls\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.050943 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.050855 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-secret-metrics-server-client-certs\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.050943 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.050910 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-audit-log\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.051091 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.050958 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-metrics-server-audit-profiles\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.051091 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.051007 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4zrp\" (UniqueName: \"kubernetes.io/projected/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-kube-api-access-v4zrp\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.151785 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.151755 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-audit-log\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.151785 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.151794 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-metrics-server-audit-profiles\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.152023 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.151837 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4zrp\" (UniqueName: \"kubernetes.io/projected/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-kube-api-access-v4zrp\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.152023 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.151912 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.152023 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.151938 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-client-ca-bundle\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.152023 ip-10-0-135-144 kubenswrapper[2581]: I0416 
16:06:06.151969 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-secret-metrics-server-tls\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.152227 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.152027 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-secret-metrics-server-client-certs\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.152295 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.152267 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-audit-log\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.153116 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.153089 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.153283 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.153256 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-metrics-server-audit-profiles\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.154830 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.154803 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-client-ca-bundle\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.154961 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.154868 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-secret-metrics-server-client-certs\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.155010 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.154980 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-secret-metrics-server-tls\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.165918 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.165858 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4zrp\" (UniqueName: \"kubernetes.io/projected/58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5-kube-api-access-v4zrp\") pod \"metrics-server-857bc6d45b-7q7v8\" (UID: \"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5\") " pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.217603 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.217571 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:06.340136 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.340015 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-857bc6d45b-7q7v8"] Apr 16 16:06:06.342612 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:06:06.342582 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58d82fc7_9e0f_4ce8_b1b0_e5d95ef105c5.slice/crio-7bc5ca2c9cec3c74f25fac335fec7ee98f9194478a50c12b16da1dccc89de890 WatchSource:0}: Error finding container 7bc5ca2c9cec3c74f25fac335fec7ee98f9194478a50c12b16da1dccc89de890: Status 404 returned error can't find the container with id 7bc5ca2c9cec3c74f25fac335fec7ee98f9194478a50c12b16da1dccc89de890 Apr 16 16:06:06.916624 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:06.916583 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" event={"ID":"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5","Type":"ContainerStarted","Data":"7bc5ca2c9cec3c74f25fac335fec7ee98f9194478a50c12b16da1dccc89de890"} Apr 16 16:06:07.921169 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:07.921129 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" event={"ID":"58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5","Type":"ContainerStarted","Data":"2e80367399fffa83a9c3e38a0fe97e9c4c355d3eae52cd36bb76a13a08f0258a"} Apr 16 16:06:07.944129 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:07.944075 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" podStartSLOduration=1.794843398 podStartE2EDuration="2.944058076s" podCreationTimestamp="2026-04-16 16:06:05 +0000 UTC" firstStartedPulling="2026-04-16 16:06:06.344622279 +0000 UTC m=+194.521683312" lastFinishedPulling="2026-04-16 16:06:07.493836948 +0000 UTC m=+195.670897990" observedRunningTime="2026-04-16 16:06:07.943017135 +0000 UTC m=+196.120078184" watchObservedRunningTime="2026-04-16 16:06:07.944058076 +0000 UTC m=+196.121119132"
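[Editor's note: the "Observed pod startup duration" entry above is internally consistent with the other three such entries in this log: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that end-to-end duration minus the image-pull window (lastFinishedPulling minus firstStartedPulling), i.e. the SLO metric excludes time spent pulling images. A minimal Go check of that relationship against the timestamps logged above; the only assumption is that the wall-clock portions are in Go's default time.Time print format, which the "m=+..." monotonic suffixes suggest:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST" // Go's default time.Time format
        parse := func(s string) time.Time {
            t, err := time.Parse(layout, s)
            if err != nil {
                panic(err)
            }
            return t
        }
        created := parse("2026-04-16 16:06:05 +0000 UTC")             // podCreationTimestamp
        firstPull := parse("2026-04-16 16:06:06.344622279 +0000 UTC") // firstStartedPulling
        lastPull := parse("2026-04-16 16:06:07.493836948 +0000 UTC")  // lastFinishedPulling
        running := parse("2026-04-16 16:06:07.944058076 +0000 UTC")   // watchObservedRunningTime

        e2e := running.Sub(created)     // 2.944058076s, matching podStartE2EDuration
        pull := lastPull.Sub(firstPull) // 1.149214669s spent pulling the image
        fmt.Println(e2e, pull, e2e-pull)
    }

This prints an SLO duration of 1.794843407s against the logged podStartSLOduration=1.794843398; the final digits differ only because the logged value is a rounded float. The same arithmetic reproduces the console, network-metrics-daemon, and keda entries later in this log.]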
Apr 16 16:06:26.217793 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:26.217753 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:26.217793 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:26.217798 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:31.667754 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.667720 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7bc88bd9c-k4zqf"] Apr 16 16:06:31.670525 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.670509 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.674747 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.674724 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 16:06:31.674747 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.674732 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 16:06:31.674963 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.674735 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 16:06:31.674963 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.674779 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 16:06:31.675115 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.675091 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 16:06:31.675638 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.675619 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 16:06:31.675753 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.675711 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-r88v9\"" Apr 16 16:06:31.675753 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.675732 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 16:06:31.681204 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.681184 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 16:06:31.686399 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.686377 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bc88bd9c-k4zqf"] Apr 16 16:06:31.752484 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.752455 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-service-ca\") pod \"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.752484 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.752489 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-trusted-ca-bundle\") pod \"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.752698 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.752535 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvmvg\" (UniqueName: \"kubernetes.io/projected/d62edf94-770c-4343-841b-3ce0c6923074-kube-api-access-wvmvg\") pod \"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.752698 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.752555 2581 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-oauth-serving-cert\") pod \"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.752698 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.752585 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d62edf94-770c-4343-841b-3ce0c6923074-console-serving-cert\") pod \"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.752698 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.752607 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d62edf94-770c-4343-841b-3ce0c6923074-console-oauth-config\") pod \"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.752698 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.752644 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-console-config\") pod \"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.853308 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.853272 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d62edf94-770c-4343-841b-3ce0c6923074-console-serving-cert\") pod \"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.853308 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.853313 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d62edf94-770c-4343-841b-3ce0c6923074-console-oauth-config\") pod \"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.853531 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.853328 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-console-config\") pod \"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.853531 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.853376 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-service-ca\") pod \"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.853531 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.853394 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-trusted-ca-bundle\") pod 
\"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.853531 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.853445 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvmvg\" (UniqueName: \"kubernetes.io/projected/d62edf94-770c-4343-841b-3ce0c6923074-kube-api-access-wvmvg\") pod \"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.853531 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.853473 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-oauth-serving-cert\") pod \"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.854236 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.854206 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-service-ca\") pod \"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.854426 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.854206 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-oauth-serving-cert\") pod \"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.854496 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.854441 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-trusted-ca-bundle\") pod \"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.854584 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.854565 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-console-config\") pod \"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.855834 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.855807 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d62edf94-770c-4343-841b-3ce0c6923074-console-serving-cert\") pod \"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.855936 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.855921 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d62edf94-770c-4343-841b-3ce0c6923074-console-oauth-config\") pod \"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.862163 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.862141 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvmvg\" 
(UniqueName: \"kubernetes.io/projected/d62edf94-770c-4343-841b-3ce0c6923074-kube-api-access-wvmvg\") pod \"console-7bc88bd9c-k4zqf\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:31.980060 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:31.979973 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:32.126141 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:32.126106 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bc88bd9c-k4zqf"] Apr 16 16:06:32.128712 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:06:32.128685 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd62edf94_770c_4343_841b_3ce0c6923074.slice/crio-95c03941d09bea5b7363646758d76ba682a1ce717a725e340d2a9cd69c5245ad WatchSource:0}: Error finding container 95c03941d09bea5b7363646758d76ba682a1ce717a725e340d2a9cd69c5245ad: Status 404 returned error can't find the container with id 95c03941d09bea5b7363646758d76ba682a1ce717a725e340d2a9cd69c5245ad Apr 16 16:06:32.989696 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:32.989658 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bc88bd9c-k4zqf" event={"ID":"d62edf94-770c-4343-841b-3ce0c6923074","Type":"ContainerStarted","Data":"95c03941d09bea5b7363646758d76ba682a1ce717a725e340d2a9cd69c5245ad"} Apr 16 16:06:34.349320 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:34.349260 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5" podUID="c30b47dc-abdc-404e-b585-3f77ecec1e5e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 16:06:34.996893 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:34.996801 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bc88bd9c-k4zqf" event={"ID":"d62edf94-770c-4343-841b-3ce0c6923074","Type":"ContainerStarted","Data":"88e56dbcab6901cb973eadee5841a129b884f4b3db65868c546ef664183dfbc1"} Apr 16 16:06:35.036309 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:35.036259 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7bc88bd9c-k4zqf" podStartSLOduration=1.552755374 podStartE2EDuration="4.036244551s" podCreationTimestamp="2026-04-16 16:06:31 +0000 UTC" firstStartedPulling="2026-04-16 16:06:32.130368579 +0000 UTC m=+220.307429613" lastFinishedPulling="2026-04-16 16:06:34.613857757 +0000 UTC m=+222.790918790" observedRunningTime="2026-04-16 16:06:35.035500605 +0000 UTC m=+223.212561659" watchObservedRunningTime="2026-04-16 16:06:35.036244551 +0000 UTC m=+223.213305607" Apr 16 16:06:41.981032 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:41.980992 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:41.981420 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:41.981044 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:41.985633 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:41.985612 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:42.018011 ip-10-0-135-144 kubenswrapper[2581]: I0416 
16:06:42.017986 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:06:44.350045 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:44.350008 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5" podUID="c30b47dc-abdc-404e-b585-3f77ecec1e5e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 16:06:46.223646 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:46.223619 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:46.227417 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:46.227389 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-857bc6d45b-7q7v8" Apr 16 16:06:54.349157 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:54.349119 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5" podUID="c30b47dc-abdc-404e-b585-3f77ecec1e5e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 16:06:54.349549 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:54.349197 2581 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5" Apr 16 16:06:54.349707 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:54.349676 2581 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"db8dc57c871cb2fd47db044e0e285c038f5329822f5c0e0ccb510be6d7eddb00"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 16:06:54.349753 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:54.349739 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5" podUID="c30b47dc-abdc-404e-b585-3f77ecec1e5e" containerName="service-proxy" containerID="cri-o://db8dc57c871cb2fd47db044e0e285c038f5329822f5c0e0ccb510be6d7eddb00" gracePeriod=30 Apr 16 16:06:55.053940 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:55.053906 2581 generic.go:358] "Generic (PLEG): container finished" podID="c30b47dc-abdc-404e-b585-3f77ecec1e5e" containerID="db8dc57c871cb2fd47db044e0e285c038f5329822f5c0e0ccb510be6d7eddb00" exitCode=2 Apr 16 16:06:55.053940 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:55.053949 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5" event={"ID":"c30b47dc-abdc-404e-b585-3f77ecec1e5e","Type":"ContainerDied","Data":"db8dc57c871cb2fd47db044e0e285c038f5329822f5c0e0ccb510be6d7eddb00"} Apr 16 16:06:55.054207 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:06:55.053977 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-548df49cd9-td9t5" event={"ID":"c30b47dc-abdc-404e-b585-3f77ecec1e5e","Type":"ContainerStarted","Data":"663efcdb016cbadd319b762374192a7dd1fe23ce199d1fd9b7500f84b0df2b2c"}
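[Editor's note: the cluster-proxy sequence above is the kubelet's standard liveness-failure path: service-proxy answers HTTP 500 to consecutive probes ten seconds apart (16:06:34, 16:06:44, 16:06:54), the prober marks the container unhealthy, the runtime manager kills it with the pod's 30-second grace period, PLEG reports ContainerDied with exitCode=2, and the restart policy immediately starts the replacement container 663efcdb.... Kubernetes documents the classification rule the prober applies: any status from 200 through 399 is a success, anything else (including this 500) is a failure. A self-contained Go sketch of that rule; the /healthz URL and one-second timeout are illustrative assumptions, not values from this log:

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // httpProbe performs one GET and classifies it the way prober.go reports
    // above: 2xx/3xx is success; anything else, or a transport error, is failure.
    func httpProbe(url string, timeout time.Duration) error {
        client := &http.Client{Timeout: timeout}
        resp, err := client.Get(url)
        if err != nil {
            return err // an unreachable endpoint also counts as a probe failure
        }
        defer resp.Body.Close()
        if resp.StatusCode >= 200 && resp.StatusCode < 400 {
            return nil
        }
        return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
    }

    func main() {
        if err := httpProbe("http://127.0.0.1:8080/healthz", time.Second); err != nil {
            fmt.Println("failure:", err) // failureThreshold of these in a row triggers a restart
        } else {
            fmt.Println("success")
        }
    }

How many failures trigger the kill depends on the probe's failureThreshold; this log is consistent with the Kubernetes default of 3.]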
Apr 16 16:07:04.219600 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:07:04.218978 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs\") pod \"network-metrics-daemon-rgfkx\" (UID: \"0aa611e2-18d8-4712-9938-e8c21daeb1a0\") " pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:07:04.222983 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:07:04.222917 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0aa611e2-18d8-4712-9938-e8c21daeb1a0-metrics-certs\") pod \"network-metrics-daemon-rgfkx\" (UID: \"0aa611e2-18d8-4712-9938-e8c21daeb1a0\") " pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:07:04.375928 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:07:04.375892 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vp75c\"" Apr 16 16:07:04.384039 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:07:04.384000 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rgfkx" Apr 16 16:07:04.534557 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:07:04.534520 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rgfkx"] Apr 16 16:07:04.538126 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:07:04.538098 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aa611e2_18d8_4712_9938_e8c21daeb1a0.slice/crio-affec999e79aaf7798f413dd7498c0f15bc9a5bbdb63fa98e2abebba208cb51a WatchSource:0}: Error finding container affec999e79aaf7798f413dd7498c0f15bc9a5bbdb63fa98e2abebba208cb51a: Status 404 returned error can't find the container with id affec999e79aaf7798f413dd7498c0f15bc9a5bbdb63fa98e2abebba208cb51a Apr 16 16:07:05.081706 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:07:05.081642 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rgfkx" event={"ID":"0aa611e2-18d8-4712-9938-e8c21daeb1a0","Type":"ContainerStarted","Data":"affec999e79aaf7798f413dd7498c0f15bc9a5bbdb63fa98e2abebba208cb51a"} Apr 16 16:07:07.090350 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:07:07.090301 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rgfkx" event={"ID":"0aa611e2-18d8-4712-9938-e8c21daeb1a0","Type":"ContainerStarted","Data":"fbee332c54347e76e35724ce4e560c41070d30b43598f0bb92367d5e97eff354"} Apr 16 16:07:07.090350 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:07:07.090353 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rgfkx" event={"ID":"0aa611e2-18d8-4712-9938-e8c21daeb1a0","Type":"ContainerStarted","Data":"0f4d0b82d32b9db1c814d68360dbba9c520ea9978da9d42a8491dedbb65a20b2"} Apr 16 16:07:07.109358 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:07:07.109288 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rgfkx" podStartSLOduration=253.198037481 podStartE2EDuration="4m15.109272465s" podCreationTimestamp="2026-04-16 16:02:52 +0000 UTC" firstStartedPulling="2026-04-16 16:07:04.539892119 +0000 UTC m=+252.716953152" lastFinishedPulling="2026-04-16 16:07:06.451127099 +0000 UTC m=+254.628188136" observedRunningTime="2026-04-16 16:07:07.107615922 +0000 UTC m=+255.284676987" watchObservedRunningTime="2026-04-16 16:07:07.109272465
+0000 UTC m=+255.286333520" Apr 16 16:07:39.917897 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:07:39.917866 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7bc88bd9c-k4zqf"] Apr 16 16:07:52.244717 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:07:52.244684 2581 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 16:08:04.936242 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:04.936202 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7bc88bd9c-k4zqf" podUID="d62edf94-770c-4343-841b-3ce0c6923074" containerName="console" containerID="cri-o://88e56dbcab6901cb973eadee5841a129b884f4b3db65868c546ef664183dfbc1" gracePeriod=15 Apr 16 16:08:05.171754 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.171732 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7bc88bd9c-k4zqf_d62edf94-770c-4343-841b-3ce0c6923074/console/0.log" Apr 16 16:08:05.171873 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.171793 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:08:05.245479 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.245393 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvmvg\" (UniqueName: \"kubernetes.io/projected/d62edf94-770c-4343-841b-3ce0c6923074-kube-api-access-wvmvg\") pod \"d62edf94-770c-4343-841b-3ce0c6923074\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " Apr 16 16:08:05.245479 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.245421 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d62edf94-770c-4343-841b-3ce0c6923074-console-serving-cert\") pod \"d62edf94-770c-4343-841b-3ce0c6923074\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " Apr 16 16:08:05.245479 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.245461 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-console-config\") pod \"d62edf94-770c-4343-841b-3ce0c6923074\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " Apr 16 16:08:05.245479 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.245480 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d62edf94-770c-4343-841b-3ce0c6923074-console-oauth-config\") pod \"d62edf94-770c-4343-841b-3ce0c6923074\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " Apr 16 16:08:05.245791 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.245505 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-trusted-ca-bundle\") pod \"d62edf94-770c-4343-841b-3ce0c6923074\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " Apr 16 16:08:05.245791 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.245535 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-service-ca\") pod \"d62edf94-770c-4343-841b-3ce0c6923074\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " Apr 16 16:08:05.245791 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.245557 2581 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-oauth-serving-cert\") pod \"d62edf94-770c-4343-841b-3ce0c6923074\" (UID: \"d62edf94-770c-4343-841b-3ce0c6923074\") " Apr 16 16:08:05.245986 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.245956 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-console-config" (OuterVolumeSpecName: "console-config") pod "d62edf94-770c-4343-841b-3ce0c6923074" (UID: "d62edf94-770c-4343-841b-3ce0c6923074"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:08:05.245986 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.245969 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d62edf94-770c-4343-841b-3ce0c6923074" (UID: "d62edf94-770c-4343-841b-3ce0c6923074"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:08:05.246087 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.245982 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d62edf94-770c-4343-841b-3ce0c6923074" (UID: "d62edf94-770c-4343-841b-3ce0c6923074"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:08:05.246087 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.246010 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-service-ca" (OuterVolumeSpecName: "service-ca") pod "d62edf94-770c-4343-841b-3ce0c6923074" (UID: "d62edf94-770c-4343-841b-3ce0c6923074"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:08:05.247828 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.247798 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d62edf94-770c-4343-841b-3ce0c6923074-kube-api-access-wvmvg" (OuterVolumeSpecName: "kube-api-access-wvmvg") pod "d62edf94-770c-4343-841b-3ce0c6923074" (UID: "d62edf94-770c-4343-841b-3ce0c6923074"). InnerVolumeSpecName "kube-api-access-wvmvg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:08:05.247936 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.247841 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d62edf94-770c-4343-841b-3ce0c6923074-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d62edf94-770c-4343-841b-3ce0c6923074" (UID: "d62edf94-770c-4343-841b-3ce0c6923074"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:08:05.247936 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.247867 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d62edf94-770c-4343-841b-3ce0c6923074-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d62edf94-770c-4343-841b-3ce0c6923074" (UID: "d62edf94-770c-4343-841b-3ce0c6923074"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:08:05.253699 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.253683 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7bc88bd9c-k4zqf_d62edf94-770c-4343-841b-3ce0c6923074/console/0.log" Apr 16 16:08:05.253788 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.253718 2581 generic.go:358] "Generic (PLEG): container finished" podID="d62edf94-770c-4343-841b-3ce0c6923074" containerID="88e56dbcab6901cb973eadee5841a129b884f4b3db65868c546ef664183dfbc1" exitCode=2 Apr 16 16:08:05.253788 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.253777 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bc88bd9c-k4zqf" event={"ID":"d62edf94-770c-4343-841b-3ce0c6923074","Type":"ContainerDied","Data":"88e56dbcab6901cb973eadee5841a129b884f4b3db65868c546ef664183dfbc1"} Apr 16 16:08:05.253788 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.253780 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bc88bd9c-k4zqf" Apr 16 16:08:05.253918 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.253801 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bc88bd9c-k4zqf" event={"ID":"d62edf94-770c-4343-841b-3ce0c6923074","Type":"ContainerDied","Data":"95c03941d09bea5b7363646758d76ba682a1ce717a725e340d2a9cd69c5245ad"} Apr 16 16:08:05.253918 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.253816 2581 scope.go:117] "RemoveContainer" containerID="88e56dbcab6901cb973eadee5841a129b884f4b3db65868c546ef664183dfbc1" Apr 16 16:08:05.261659 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.261636 2581 scope.go:117] "RemoveContainer" containerID="88e56dbcab6901cb973eadee5841a129b884f4b3db65868c546ef664183dfbc1" Apr 16 16:08:05.262041 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:08:05.262019 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88e56dbcab6901cb973eadee5841a129b884f4b3db65868c546ef664183dfbc1\": container with ID starting with 88e56dbcab6901cb973eadee5841a129b884f4b3db65868c546ef664183dfbc1 not found: ID does not exist" containerID="88e56dbcab6901cb973eadee5841a129b884f4b3db65868c546ef664183dfbc1" Apr 16 16:08:05.262127 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.262049 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88e56dbcab6901cb973eadee5841a129b884f4b3db65868c546ef664183dfbc1"} err="failed to get container status \"88e56dbcab6901cb973eadee5841a129b884f4b3db65868c546ef664183dfbc1\": rpc error: code = NotFound desc = could not find container \"88e56dbcab6901cb973eadee5841a129b884f4b3db65868c546ef664183dfbc1\": container with ID starting with 88e56dbcab6901cb973eadee5841a129b884f4b3db65868c546ef664183dfbc1 not found: ID does not exist" Apr 16 16:08:05.273265 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.273237 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7bc88bd9c-k4zqf"] Apr 16 16:08:05.277383 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.277359 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7bc88bd9c-k4zqf"] Apr 16 16:08:05.346165 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.346129 2581 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-service-ca\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 16 16:08:05.346165 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.346160 2581 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-oauth-serving-cert\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 16 16:08:05.346165 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.346170 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wvmvg\" (UniqueName: \"kubernetes.io/projected/d62edf94-770c-4343-841b-3ce0c6923074-kube-api-access-wvmvg\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 16 16:08:05.346427 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.346181 2581 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d62edf94-770c-4343-841b-3ce0c6923074-console-serving-cert\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 16 16:08:05.346427 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.346190 2581 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-console-config\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 16 16:08:05.346427 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.346199 2581 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d62edf94-770c-4343-841b-3ce0c6923074-console-oauth-config\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 16 16:08:05.346427 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:05.346207 2581 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d62edf94-770c-4343-841b-3ce0c6923074-trusted-ca-bundle\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 16 16:08:06.378265 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:08:06.378232 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d62edf94-770c-4343-841b-3ce0c6923074" path="/var/lib/kubelet/pods/d62edf94-770c-4343-841b-3ce0c6923074/volumes" Apr 16 16:10:16.197401 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:16.197309 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6"] Apr 16 16:10:16.197886 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:16.197646 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d62edf94-770c-4343-841b-3ce0c6923074" containerName="console" Apr 16 16:10:16.197886 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:16.197661 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62edf94-770c-4343-841b-3ce0c6923074" containerName="console" Apr 16 16:10:16.197886 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:16.197701 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="d62edf94-770c-4343-841b-3ce0c6923074" containerName="console" Apr 16 16:10:16.199464 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:16.199447 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6" Apr 16 16:10:16.201930 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:16.201907 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-l4w5c\"" Apr 16 16:10:16.202045 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:16.201994 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 16:10:16.202778 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:16.202765 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 16:10:16.204626 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:16.204609 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 16:10:16.215685 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:16.215663 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6"] Apr 16 16:10:16.328380 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:16.328314 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmdsr\" (UniqueName: \"kubernetes.io/projected/f2e72353-b4fa-4515-bdf9-3db73204813d-kube-api-access-hmdsr\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6\" (UID: \"f2e72353-b4fa-4515-bdf9-3db73204813d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6" Apr 16 16:10:16.328380 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:16.328386 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/f2e72353-b4fa-4515-bdf9-3db73204813d-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6\" (UID: \"f2e72353-b4fa-4515-bdf9-3db73204813d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6" Apr 16 16:10:16.428776 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:16.428737 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmdsr\" (UniqueName: \"kubernetes.io/projected/f2e72353-b4fa-4515-bdf9-3db73204813d-kube-api-access-hmdsr\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6\" (UID: \"f2e72353-b4fa-4515-bdf9-3db73204813d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6" Apr 16 16:10:16.428776 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:16.428787 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/f2e72353-b4fa-4515-bdf9-3db73204813d-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6\" (UID: \"f2e72353-b4fa-4515-bdf9-3db73204813d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6" Apr 16 16:10:16.431109 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:16.431079 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/f2e72353-b4fa-4515-bdf9-3db73204813d-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6\" (UID: \"f2e72353-b4fa-4515-bdf9-3db73204813d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6" Apr 16 16:10:16.444072 ip-10-0-135-144 kubenswrapper[2581]: I0416 
16:10:16.444048 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmdsr\" (UniqueName: \"kubernetes.io/projected/f2e72353-b4fa-4515-bdf9-3db73204813d-kube-api-access-hmdsr\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6\" (UID: \"f2e72353-b4fa-4515-bdf9-3db73204813d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6" Apr 16 16:10:16.509728 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:16.509624 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6" Apr 16 16:10:16.632797 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:16.632763 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6"] Apr 16 16:10:16.637362 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:10:16.637320 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2e72353_b4fa_4515_bdf9_3db73204813d.slice/crio-865d85c6ca7d204fe99bdb616551f34aabcb047b61750b6fb78990ade28aa517 WatchSource:0}: Error finding container 865d85c6ca7d204fe99bdb616551f34aabcb047b61750b6fb78990ade28aa517: Status 404 returned error can't find the container with id 865d85c6ca7d204fe99bdb616551f34aabcb047b61750b6fb78990ade28aa517 Apr 16 16:10:16.639519 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:16.639503 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:10:17.593434 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:17.593393 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6" event={"ID":"f2e72353-b4fa-4515-bdf9-3db73204813d","Type":"ContainerStarted","Data":"865d85c6ca7d204fe99bdb616551f34aabcb047b61750b6fb78990ade28aa517"} Apr 16 16:10:20.603965 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:20.603868 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6" event={"ID":"f2e72353-b4fa-4515-bdf9-3db73204813d","Type":"ContainerStarted","Data":"5d58a96f94f616921b355555890ae9e53653c36f94f5b8872f6bcb99d40daad5"} Apr 16 16:10:20.603965 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:20.603918 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6" Apr 16 16:10:20.623652 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:20.623592 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6" podStartSLOduration=0.967195038 podStartE2EDuration="4.62357345s" podCreationTimestamp="2026-04-16 16:10:16 +0000 UTC" firstStartedPulling="2026-04-16 16:10:16.639627438 +0000 UTC m=+444.816688471" lastFinishedPulling="2026-04-16 16:10:20.29600585 +0000 UTC m=+448.473066883" observedRunningTime="2026-04-16 16:10:20.621963263 +0000 UTC m=+448.799024329" watchObservedRunningTime="2026-04-16 16:10:20.62357345 +0000 UTC m=+448.800634506" Apr 16 16:10:21.131398 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:21.129678 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4"] Apr 16 16:10:21.133051 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:21.133022 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4" Apr 16 16:10:21.135532 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:21.135510 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 16:10:21.135674 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:21.135513 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-rwkrh\"" Apr 16 16:10:21.135674 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:21.135514 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 16:10:21.141836 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:21.141807 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4"] Apr 16 16:10:21.268574 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:21.268535 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd4bh\" (UniqueName: \"kubernetes.io/projected/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-kube-api-access-bd4bh\") pod \"keda-metrics-apiserver-7c9f485588-5xdz4\" (UID: \"a1a5652a-a051-48b1-9c7e-27150ad3e2b1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4" Apr 16 16:10:21.268745 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:21.268577 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-5xdz4\" (UID: \"a1a5652a-a051-48b1-9c7e-27150ad3e2b1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4" Apr 16 16:10:21.268745 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:21.268695 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5xdz4\" (UID: \"a1a5652a-a051-48b1-9c7e-27150ad3e2b1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4" Apr 16 16:10:21.370000 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:21.369972 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5xdz4\" (UID: \"a1a5652a-a051-48b1-9c7e-27150ad3e2b1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4" Apr 16 16:10:21.370170 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:21.370014 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bd4bh\" (UniqueName: \"kubernetes.io/projected/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-kube-api-access-bd4bh\") pod \"keda-metrics-apiserver-7c9f485588-5xdz4\" (UID: \"a1a5652a-a051-48b1-9c7e-27150ad3e2b1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4" Apr 16 16:10:21.370170 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:21.370035 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-5xdz4\" (UID: \"a1a5652a-a051-48b1-9c7e-27150ad3e2b1\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4" Apr 16 16:10:21.370170 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:10:21.370121 2581 secret.go:281] references non-existent secret key: tls.crt Apr 16 16:10:21.370170 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:10:21.370140 2581 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 16:10:21.370170 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:10:21.370160 2581 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4: references non-existent secret key: tls.crt Apr 16 16:10:21.370332 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:10:21.370214 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-certificates podName:a1a5652a-a051-48b1-9c7e-27150ad3e2b1 nodeName:}" failed. No retries permitted until 2026-04-16 16:10:21.87019543 +0000 UTC m=+450.047256472 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-certificates") pod "keda-metrics-apiserver-7c9f485588-5xdz4" (UID: "a1a5652a-a051-48b1-9c7e-27150ad3e2b1") : references non-existent secret key: tls.crt Apr 16 16:10:21.370416 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:21.370401 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-5xdz4\" (UID: \"a1a5652a-a051-48b1-9c7e-27150ad3e2b1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4" Apr 16 16:10:21.390710 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:21.390630 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd4bh\" (UniqueName: \"kubernetes.io/projected/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-kube-api-access-bd4bh\") pod \"keda-metrics-apiserver-7c9f485588-5xdz4\" (UID: \"a1a5652a-a051-48b1-9c7e-27150ad3e2b1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4" Apr 16 16:10:21.874311 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:21.874265 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5xdz4\" (UID: \"a1a5652a-a051-48b1-9c7e-27150ad3e2b1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4" Apr 16 16:10:21.874686 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:10:21.874418 2581 secret.go:281] references non-existent secret key: tls.crt Apr 16 16:10:21.874686 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:10:21.874441 2581 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 16:10:21.874686 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:10:21.874462 2581 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4: references non-existent secret key: tls.crt Apr 16 16:10:21.874686 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:10:21.874528 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-certificates podName:a1a5652a-a051-48b1-9c7e-27150ad3e2b1 nodeName:}" 
failed. No retries permitted until 2026-04-16 16:10:22.874513259 +0000 UTC m=+451.051574292 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-certificates") pod "keda-metrics-apiserver-7c9f485588-5xdz4" (UID: "a1a5652a-a051-48b1-9c7e-27150ad3e2b1") : references non-existent secret key: tls.crt Apr 16 16:10:22.882428 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:22.882390 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5xdz4\" (UID: \"a1a5652a-a051-48b1-9c7e-27150ad3e2b1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4" Apr 16 16:10:22.882802 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:10:22.882532 2581 secret.go:281] references non-existent secret key: tls.crt Apr 16 16:10:22.882802 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:10:22.882549 2581 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 16:10:22.882802 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:10:22.882566 2581 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4: references non-existent secret key: tls.crt Apr 16 16:10:22.882802 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:10:22.882627 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-certificates podName:a1a5652a-a051-48b1-9c7e-27150ad3e2b1 nodeName:}" failed. No retries permitted until 2026-04-16 16:10:24.882610278 +0000 UTC m=+453.059671317 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-certificates") pod "keda-metrics-apiserver-7c9f485588-5xdz4" (UID: "a1a5652a-a051-48b1-9c7e-27150ad3e2b1") : references non-existent secret key: tls.crt Apr 16 16:10:24.901153 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:24.901115 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5xdz4\" (UID: \"a1a5652a-a051-48b1-9c7e-27150ad3e2b1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4" Apr 16 16:10:24.901688 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:10:24.901227 2581 secret.go:281] references non-existent secret key: tls.crt Apr 16 16:10:24.901688 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:10:24.901239 2581 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 16:10:24.901688 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:10:24.901256 2581 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4: references non-existent secret key: tls.crt Apr 16 16:10:24.901688 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:10:24.901307 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-certificates podName:a1a5652a-a051-48b1-9c7e-27150ad3e2b1 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:10:28.901290089 +0000 UTC m=+457.078351125 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-certificates") pod "keda-metrics-apiserver-7c9f485588-5xdz4" (UID: "a1a5652a-a051-48b1-9c7e-27150ad3e2b1") : references non-existent secret key: tls.crt Apr 16 16:10:28.936120 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:28.936080 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5xdz4\" (UID: \"a1a5652a-a051-48b1-9c7e-27150ad3e2b1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4" Apr 16 16:10:28.938483 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:28.938452 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1a5652a-a051-48b1-9c7e-27150ad3e2b1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5xdz4\" (UID: \"a1a5652a-a051-48b1-9c7e-27150ad3e2b1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4" Apr 16 16:10:28.944245 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:28.944218 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4" Apr 16 16:10:29.063140 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:29.063006 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4"] Apr 16 16:10:29.065775 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:10:29.065747 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a5652a_a051_48b1_9c7e_27150ad3e2b1.slice/crio-b0a4b4f61dfe8ce11b1e44c92981402aa8114dad6652ed481e3ebb15a33a3a05 WatchSource:0}: Error finding container b0a4b4f61dfe8ce11b1e44c92981402aa8114dad6652ed481e3ebb15a33a3a05: Status 404 returned error can't find the container with id b0a4b4f61dfe8ce11b1e44c92981402aa8114dad6652ed481e3ebb15a33a3a05 Apr 16 16:10:29.629141 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:29.629102 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4" event={"ID":"a1a5652a-a051-48b1-9c7e-27150ad3e2b1","Type":"ContainerStarted","Data":"b0a4b4f61dfe8ce11b1e44c92981402aa8114dad6652ed481e3ebb15a33a3a05"} Apr 16 16:10:31.636231 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:31.636192 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4" event={"ID":"a1a5652a-a051-48b1-9c7e-27150ad3e2b1","Type":"ContainerStarted","Data":"1f805b9f8979352b8d3789e6c490c808cca29cd1643eb2838805f975b26d4a23"} Apr 16 16:10:31.636607 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:31.636316 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4" Apr 16 16:10:31.652747 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:31.652699 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4" podStartSLOduration=8.297663455 podStartE2EDuration="10.652686005s" podCreationTimestamp="2026-04-16 16:10:21 +0000 UTC" firstStartedPulling="2026-04-16 16:10:29.066938991 +0000 UTC 
m=+457.244000025" lastFinishedPulling="2026-04-16 16:10:31.421961524 +0000 UTC m=+459.599022575" observedRunningTime="2026-04-16 16:10:31.652258769 +0000 UTC m=+459.829319824" watchObservedRunningTime="2026-04-16 16:10:31.652686005 +0000 UTC m=+459.829747061" Apr 16 16:10:41.609766 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:41.609736 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6k4r6" Apr 16 16:10:42.642938 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:10:42.642910 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5xdz4" Apr 16 16:11:27.591892 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.591858 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-cxcmv"] Apr 16 16:11:27.593948 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.593929 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-cxcmv" Apr 16 16:11:27.596632 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.596610 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-mdwm6\"" Apr 16 16:11:27.597178 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.597163 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 16:11:27.597247 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.597192 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 16:11:27.597247 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.597212 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 16:11:27.603815 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.603791 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-cxcmv"] Apr 16 16:11:27.630248 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.630219 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-85qhn"] Apr 16 16:11:27.632301 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.632282 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-85qhn" Apr 16 16:11:27.635080 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.635064 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-smnt5\"" Apr 16 16:11:27.635183 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.635067 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 16:11:27.647717 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.647685 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-85qhn"] Apr 16 16:11:27.649195 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.649144 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1fa173dd-2ccd-4207-8239-80027a30a8f7-cert\") pod \"kserve-controller-manager-7c68cb4fc8-cxcmv\" (UID: \"1fa173dd-2ccd-4207-8239-80027a30a8f7\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-cxcmv" Apr 16 16:11:27.649445 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.649420 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnsql\" (UniqueName: \"kubernetes.io/projected/1fa173dd-2ccd-4207-8239-80027a30a8f7-kube-api-access-rnsql\") pod \"kserve-controller-manager-7c68cb4fc8-cxcmv\" (UID: \"1fa173dd-2ccd-4207-8239-80027a30a8f7\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-cxcmv" Apr 16 16:11:27.750284 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.750240 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c6ec5fcc-e19c-4b4f-9e6a-205f0f34d2bf-data\") pod \"seaweedfs-86cc847c5c-85qhn\" (UID: \"c6ec5fcc-e19c-4b4f-9e6a-205f0f34d2bf\") " pod="kserve/seaweedfs-86cc847c5c-85qhn" Apr 16 16:11:27.750284 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.750289 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnsql\" (UniqueName: \"kubernetes.io/projected/1fa173dd-2ccd-4207-8239-80027a30a8f7-kube-api-access-rnsql\") pod \"kserve-controller-manager-7c68cb4fc8-cxcmv\" (UID: \"1fa173dd-2ccd-4207-8239-80027a30a8f7\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-cxcmv" Apr 16 16:11:27.750521 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.750392 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8pzs\" (UniqueName: \"kubernetes.io/projected/c6ec5fcc-e19c-4b4f-9e6a-205f0f34d2bf-kube-api-access-n8pzs\") pod \"seaweedfs-86cc847c5c-85qhn\" (UID: \"c6ec5fcc-e19c-4b4f-9e6a-205f0f34d2bf\") " pod="kserve/seaweedfs-86cc847c5c-85qhn" Apr 16 16:11:27.750521 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.750442 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1fa173dd-2ccd-4207-8239-80027a30a8f7-cert\") pod \"kserve-controller-manager-7c68cb4fc8-cxcmv\" (UID: \"1fa173dd-2ccd-4207-8239-80027a30a8f7\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-cxcmv" Apr 16 16:11:27.750611 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:11:27.750533 2581 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 16 16:11:27.750611 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:11:27.750590 2581 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/1fa173dd-2ccd-4207-8239-80027a30a8f7-cert podName:1fa173dd-2ccd-4207-8239-80027a30a8f7 nodeName:}" failed. No retries permitted until 2026-04-16 16:11:28.250570454 +0000 UTC m=+516.427631487 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1fa173dd-2ccd-4207-8239-80027a30a8f7-cert") pod "kserve-controller-manager-7c68cb4fc8-cxcmv" (UID: "1fa173dd-2ccd-4207-8239-80027a30a8f7") : secret "kserve-webhook-server-cert" not found Apr 16 16:11:27.759469 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.759446 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnsql\" (UniqueName: \"kubernetes.io/projected/1fa173dd-2ccd-4207-8239-80027a30a8f7-kube-api-access-rnsql\") pod \"kserve-controller-manager-7c68cb4fc8-cxcmv\" (UID: \"1fa173dd-2ccd-4207-8239-80027a30a8f7\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-cxcmv" Apr 16 16:11:27.851094 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.850992 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c6ec5fcc-e19c-4b4f-9e6a-205f0f34d2bf-data\") pod \"seaweedfs-86cc847c5c-85qhn\" (UID: \"c6ec5fcc-e19c-4b4f-9e6a-205f0f34d2bf\") " pod="kserve/seaweedfs-86cc847c5c-85qhn" Apr 16 16:11:27.851094 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.851049 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8pzs\" (UniqueName: \"kubernetes.io/projected/c6ec5fcc-e19c-4b4f-9e6a-205f0f34d2bf-kube-api-access-n8pzs\") pod \"seaweedfs-86cc847c5c-85qhn\" (UID: \"c6ec5fcc-e19c-4b4f-9e6a-205f0f34d2bf\") " pod="kserve/seaweedfs-86cc847c5c-85qhn" Apr 16 16:11:27.851460 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.851439 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c6ec5fcc-e19c-4b4f-9e6a-205f0f34d2bf-data\") pod \"seaweedfs-86cc847c5c-85qhn\" (UID: \"c6ec5fcc-e19c-4b4f-9e6a-205f0f34d2bf\") " pod="kserve/seaweedfs-86cc847c5c-85qhn" Apr 16 16:11:27.858161 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.858136 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8pzs\" (UniqueName: \"kubernetes.io/projected/c6ec5fcc-e19c-4b4f-9e6a-205f0f34d2bf-kube-api-access-n8pzs\") pod \"seaweedfs-86cc847c5c-85qhn\" (UID: \"c6ec5fcc-e19c-4b4f-9e6a-205f0f34d2bf\") " pod="kserve/seaweedfs-86cc847c5c-85qhn" Apr 16 16:11:27.940812 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:27.940770 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-85qhn" Apr 16 16:11:28.079938 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:28.079828 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-85qhn"] Apr 16 16:11:28.082723 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:11:28.082695 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6ec5fcc_e19c_4b4f_9e6a_205f0f34d2bf.slice/crio-2e52ef4c1013138977b7c2fc1b478a1c26f39ef35a86d57817771f7a74a93574 WatchSource:0}: Error finding container 2e52ef4c1013138977b7c2fc1b478a1c26f39ef35a86d57817771f7a74a93574: Status 404 returned error can't find the container with id 2e52ef4c1013138977b7c2fc1b478a1c26f39ef35a86d57817771f7a74a93574 Apr 16 16:11:28.254358 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:28.254239 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1fa173dd-2ccd-4207-8239-80027a30a8f7-cert\") pod \"kserve-controller-manager-7c68cb4fc8-cxcmv\" (UID: \"1fa173dd-2ccd-4207-8239-80027a30a8f7\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-cxcmv" Apr 16 16:11:28.256640 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:28.256614 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1fa173dd-2ccd-4207-8239-80027a30a8f7-cert\") pod \"kserve-controller-manager-7c68cb4fc8-cxcmv\" (UID: \"1fa173dd-2ccd-4207-8239-80027a30a8f7\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-cxcmv" Apr 16 16:11:28.504188 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:28.504149 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-cxcmv" Apr 16 16:11:28.638718 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:28.638682 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-cxcmv"] Apr 16 16:11:28.666722 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:11:28.666687 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fa173dd_2ccd_4207_8239_80027a30a8f7.slice/crio-ac7684a0bc20a73056f4627b6036b6679421a68aa67eae6d4de1be747c4cd6d1 WatchSource:0}: Error finding container ac7684a0bc20a73056f4627b6036b6679421a68aa67eae6d4de1be747c4cd6d1: Status 404 returned error can't find the container with id ac7684a0bc20a73056f4627b6036b6679421a68aa67eae6d4de1be747c4cd6d1 Apr 16 16:11:28.793300 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:28.793238 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-cxcmv" event={"ID":"1fa173dd-2ccd-4207-8239-80027a30a8f7","Type":"ContainerStarted","Data":"ac7684a0bc20a73056f4627b6036b6679421a68aa67eae6d4de1be747c4cd6d1"} Apr 16 16:11:28.794559 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:28.794528 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-85qhn" event={"ID":"c6ec5fcc-e19c-4b4f-9e6a-205f0f34d2bf","Type":"ContainerStarted","Data":"2e52ef4c1013138977b7c2fc1b478a1c26f39ef35a86d57817771f7a74a93574"} Apr 16 16:11:32.810162 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:32.810124 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-cxcmv" 
event={"ID":"1fa173dd-2ccd-4207-8239-80027a30a8f7","Type":"ContainerStarted","Data":"75a5faf5418dbd970b7e2a6123c4da52ff45d2a595b50bfbc31cf81d81f39c2c"} Apr 16 16:11:32.810737 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:32.810229 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7c68cb4fc8-cxcmv" Apr 16 16:11:32.811467 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:32.811446 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-85qhn" event={"ID":"c6ec5fcc-e19c-4b4f-9e6a-205f0f34d2bf","Type":"ContainerStarted","Data":"3449d1647f470eea75f632168e500a2e9ecb39279b41b9c39b50740af2b10b83"} Apr 16 16:11:32.811579 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:32.811567 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-85qhn" Apr 16 16:11:32.839506 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:32.839452 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-85qhn" podStartSLOduration=1.8939705249999998 podStartE2EDuration="5.839435353s" podCreationTimestamp="2026-04-16 16:11:27 +0000 UTC" firstStartedPulling="2026-04-16 16:11:28.083913308 +0000 UTC m=+516.260974341" lastFinishedPulling="2026-04-16 16:11:32.029378124 +0000 UTC m=+520.206439169" observedRunningTime="2026-04-16 16:11:32.83903659 +0000 UTC m=+521.016097643" watchObservedRunningTime="2026-04-16 16:11:32.839435353 +0000 UTC m=+521.016496408" Apr 16 16:11:32.839766 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:32.839743 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7c68cb4fc8-cxcmv" podStartSLOduration=2.553152985 podStartE2EDuration="5.839737915s" podCreationTimestamp="2026-04-16 16:11:27 +0000 UTC" firstStartedPulling="2026-04-16 16:11:28.668563906 +0000 UTC m=+516.845624939" lastFinishedPulling="2026-04-16 16:11:31.955148833 +0000 UTC m=+520.132209869" observedRunningTime="2026-04-16 16:11:32.825261478 +0000 UTC m=+521.002322533" watchObservedRunningTime="2026-04-16 16:11:32.839737915 +0000 UTC m=+521.016799027" Apr 16 16:11:38.816711 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:11:38.816680 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-85qhn" Apr 16 16:12:02.783389 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:02.783283 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-cxcmv"] Apr 16 16:12:02.783861 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:02.783622 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-7c68cb4fc8-cxcmv" podUID="1fa173dd-2ccd-4207-8239-80027a30a8f7" containerName="manager" containerID="cri-o://75a5faf5418dbd970b7e2a6123c4da52ff45d2a595b50bfbc31cf81d81f39c2c" gracePeriod=10 Apr 16 16:12:02.788580 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:02.788557 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7c68cb4fc8-cxcmv" Apr 16 16:12:02.807609 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:02.807581 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-xnxw9"] Apr 16 16:12:02.809555 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:02.809541 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-xnxw9" Apr 16 16:12:02.819649 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:02.819626 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-xnxw9"] Apr 16 16:12:02.926424 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:02.926370 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9glj7\" (UniqueName: \"kubernetes.io/projected/81ac8e20-f6b4-437d-a5da-95ea69a24a97-kube-api-access-9glj7\") pod \"kserve-controller-manager-7c68cb4fc8-xnxw9\" (UID: \"81ac8e20-f6b4-437d-a5da-95ea69a24a97\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-xnxw9" Apr 16 16:12:02.926611 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:02.926452 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81ac8e20-f6b4-437d-a5da-95ea69a24a97-cert\") pod \"kserve-controller-manager-7c68cb4fc8-xnxw9\" (UID: \"81ac8e20-f6b4-437d-a5da-95ea69a24a97\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-xnxw9" Apr 16 16:12:03.026513 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.026489 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-cxcmv" Apr 16 16:12:03.026813 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.026792 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9glj7\" (UniqueName: \"kubernetes.io/projected/81ac8e20-f6b4-437d-a5da-95ea69a24a97-kube-api-access-9glj7\") pod \"kserve-controller-manager-7c68cb4fc8-xnxw9\" (UID: \"81ac8e20-f6b4-437d-a5da-95ea69a24a97\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-xnxw9" Apr 16 16:12:03.026902 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.026832 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81ac8e20-f6b4-437d-a5da-95ea69a24a97-cert\") pod \"kserve-controller-manager-7c68cb4fc8-xnxw9\" (UID: \"81ac8e20-f6b4-437d-a5da-95ea69a24a97\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-xnxw9" Apr 16 16:12:03.029022 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.029003 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81ac8e20-f6b4-437d-a5da-95ea69a24a97-cert\") pod \"kserve-controller-manager-7c68cb4fc8-xnxw9\" (UID: \"81ac8e20-f6b4-437d-a5da-95ea69a24a97\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-xnxw9" Apr 16 16:12:03.035764 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.035706 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9glj7\" (UniqueName: \"kubernetes.io/projected/81ac8e20-f6b4-437d-a5da-95ea69a24a97-kube-api-access-9glj7\") pod \"kserve-controller-manager-7c68cb4fc8-xnxw9\" (UID: \"81ac8e20-f6b4-437d-a5da-95ea69a24a97\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-xnxw9" Apr 16 16:12:03.128043 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.128010 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1fa173dd-2ccd-4207-8239-80027a30a8f7-cert\") pod \"1fa173dd-2ccd-4207-8239-80027a30a8f7\" (UID: \"1fa173dd-2ccd-4207-8239-80027a30a8f7\") " Apr 16 16:12:03.128043 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.128046 2581 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-rnsql\" (UniqueName: \"kubernetes.io/projected/1fa173dd-2ccd-4207-8239-80027a30a8f7-kube-api-access-rnsql\") pod \"1fa173dd-2ccd-4207-8239-80027a30a8f7\" (UID: \"1fa173dd-2ccd-4207-8239-80027a30a8f7\") " Apr 16 16:12:03.130232 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.130205 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa173dd-2ccd-4207-8239-80027a30a8f7-cert" (OuterVolumeSpecName: "cert") pod "1fa173dd-2ccd-4207-8239-80027a30a8f7" (UID: "1fa173dd-2ccd-4207-8239-80027a30a8f7"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:12:03.130302 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.130225 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa173dd-2ccd-4207-8239-80027a30a8f7-kube-api-access-rnsql" (OuterVolumeSpecName: "kube-api-access-rnsql") pod "1fa173dd-2ccd-4207-8239-80027a30a8f7" (UID: "1fa173dd-2ccd-4207-8239-80027a30a8f7"). InnerVolumeSpecName "kube-api-access-rnsql". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:12:03.169043 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.169010 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-xnxw9" Apr 16 16:12:03.229400 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.229357 2581 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1fa173dd-2ccd-4207-8239-80027a30a8f7-cert\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 16 16:12:03.229557 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.229413 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rnsql\" (UniqueName: \"kubernetes.io/projected/1fa173dd-2ccd-4207-8239-80027a30a8f7-kube-api-access-rnsql\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 16 16:12:03.289543 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.289518 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-xnxw9"] Apr 16 16:12:03.291896 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:12:03.291864 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81ac8e20_f6b4_437d_a5da_95ea69a24a97.slice/crio-6cba1b8b2471cf508a870a717c4eecdfb2b3b5c7bb21b18bc6e1a6fb12f8a8aa WatchSource:0}: Error finding container 6cba1b8b2471cf508a870a717c4eecdfb2b3b5c7bb21b18bc6e1a6fb12f8a8aa: Status 404 returned error can't find the container with id 6cba1b8b2471cf508a870a717c4eecdfb2b3b5c7bb21b18bc6e1a6fb12f8a8aa Apr 16 16:12:03.898187 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.898152 2581 generic.go:358] "Generic (PLEG): container finished" podID="1fa173dd-2ccd-4207-8239-80027a30a8f7" containerID="75a5faf5418dbd970b7e2a6123c4da52ff45d2a595b50bfbc31cf81d81f39c2c" exitCode=0 Apr 16 16:12:03.898694 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.898215 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-cxcmv" Apr 16 16:12:03.898694 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.898235 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-cxcmv" event={"ID":"1fa173dd-2ccd-4207-8239-80027a30a8f7","Type":"ContainerDied","Data":"75a5faf5418dbd970b7e2a6123c4da52ff45d2a595b50bfbc31cf81d81f39c2c"} Apr 16 16:12:03.898694 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.898274 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-cxcmv" event={"ID":"1fa173dd-2ccd-4207-8239-80027a30a8f7","Type":"ContainerDied","Data":"ac7684a0bc20a73056f4627b6036b6679421a68aa67eae6d4de1be747c4cd6d1"} Apr 16 16:12:03.898694 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.898295 2581 scope.go:117] "RemoveContainer" containerID="75a5faf5418dbd970b7e2a6123c4da52ff45d2a595b50bfbc31cf81d81f39c2c" Apr 16 16:12:03.899782 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.899758 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-xnxw9" event={"ID":"81ac8e20-f6b4-437d-a5da-95ea69a24a97","Type":"ContainerStarted","Data":"2c328e8f6f45f1f211be3026a484c9517f29693a4358bd8e5fc75a8be3546232"} Apr 16 16:12:03.899885 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.899794 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-xnxw9" event={"ID":"81ac8e20-f6b4-437d-a5da-95ea69a24a97","Type":"ContainerStarted","Data":"6cba1b8b2471cf508a870a717c4eecdfb2b3b5c7bb21b18bc6e1a6fb12f8a8aa"} Apr 16 16:12:03.899978 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.899964 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7c68cb4fc8-xnxw9" Apr 16 16:12:03.906083 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.906060 2581 scope.go:117] "RemoveContainer" containerID="75a5faf5418dbd970b7e2a6123c4da52ff45d2a595b50bfbc31cf81d81f39c2c" Apr 16 16:12:03.906418 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:12:03.906393 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75a5faf5418dbd970b7e2a6123c4da52ff45d2a595b50bfbc31cf81d81f39c2c\": container with ID starting with 75a5faf5418dbd970b7e2a6123c4da52ff45d2a595b50bfbc31cf81d81f39c2c not found: ID does not exist" containerID="75a5faf5418dbd970b7e2a6123c4da52ff45d2a595b50bfbc31cf81d81f39c2c" Apr 16 16:12:03.906498 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.906424 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75a5faf5418dbd970b7e2a6123c4da52ff45d2a595b50bfbc31cf81d81f39c2c"} err="failed to get container status \"75a5faf5418dbd970b7e2a6123c4da52ff45d2a595b50bfbc31cf81d81f39c2c\": rpc error: code = NotFound desc = could not find container \"75a5faf5418dbd970b7e2a6123c4da52ff45d2a595b50bfbc31cf81d81f39c2c\": container with ID starting with 75a5faf5418dbd970b7e2a6123c4da52ff45d2a595b50bfbc31cf81d81f39c2c not found: ID does not exist" Apr 16 16:12:03.918113 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.918067 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7c68cb4fc8-xnxw9" podStartSLOduration=1.492926278 podStartE2EDuration="1.918054497s" podCreationTimestamp="2026-04-16 16:12:02 +0000 UTC" firstStartedPulling="2026-04-16 16:12:03.293433645 +0000 UTC 
m=+551.470494677" lastFinishedPulling="2026-04-16 16:12:03.718561863 +0000 UTC m=+551.895622896" observedRunningTime="2026-04-16 16:12:03.91645849 +0000 UTC m=+552.093519556" watchObservedRunningTime="2026-04-16 16:12:03.918054497 +0000 UTC m=+552.095115552" Apr 16 16:12:03.931714 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.931690 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-cxcmv"] Apr 16 16:12:03.937879 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:03.937842 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-cxcmv"] Apr 16 16:12:04.375807 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:04.375773 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa173dd-2ccd-4207-8239-80027a30a8f7" path="/var/lib/kubelet/pods/1fa173dd-2ccd-4207-8239-80027a30a8f7/volumes" Apr 16 16:12:34.908255 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:34.908223 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7c68cb4fc8-xnxw9" Apr 16 16:12:35.761843 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.761794 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-tbm4r"] Apr 16 16:12:35.762083 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.762071 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fa173dd-2ccd-4207-8239-80027a30a8f7" containerName="manager" Apr 16 16:12:35.762128 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.762084 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa173dd-2ccd-4207-8239-80027a30a8f7" containerName="manager" Apr 16 16:12:35.762168 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.762140 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="1fa173dd-2ccd-4207-8239-80027a30a8f7" containerName="manager" Apr 16 16:12:35.764880 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.764862 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-tbm4r" Apr 16 16:12:35.766995 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.766961 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 16:12:35.767138 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.767012 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-d9ft9\"" Apr 16 16:12:35.773634 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.773608 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-tbm4r"] Apr 16 16:12:35.776437 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.776417 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-76ck7"] Apr 16 16:12:35.779570 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.779552 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-76ck7" Apr 16 16:12:35.781660 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.781641 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 16:12:35.781660 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.781656 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-st8c4\"" Apr 16 16:12:35.787826 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.787805 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-76ck7"] Apr 16 16:12:35.856716 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.856679 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17995599-6c5a-4b44-9ee4-7a83662efb06-cert\") pod \"odh-model-controller-696fc77849-76ck7\" (UID: \"17995599-6c5a-4b44-9ee4-7a83662efb06\") " pod="kserve/odh-model-controller-696fc77849-76ck7" Apr 16 16:12:35.856716 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.856719 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nc8p\" (UniqueName: \"kubernetes.io/projected/17995599-6c5a-4b44-9ee4-7a83662efb06-kube-api-access-4nc8p\") pod \"odh-model-controller-696fc77849-76ck7\" (UID: \"17995599-6c5a-4b44-9ee4-7a83662efb06\") " pod="kserve/odh-model-controller-696fc77849-76ck7" Apr 16 16:12:35.856966 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.856750 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmq7x\" (UniqueName: \"kubernetes.io/projected/7efe7390-4de7-4f12-908a-334e6c6fe696-kube-api-access-pmq7x\") pod \"model-serving-api-86f7b4b499-tbm4r\" (UID: \"7efe7390-4de7-4f12-908a-334e6c6fe696\") " pod="kserve/model-serving-api-86f7b4b499-tbm4r" Apr 16 16:12:35.856966 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.856835 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7efe7390-4de7-4f12-908a-334e6c6fe696-tls-certs\") pod \"model-serving-api-86f7b4b499-tbm4r\" (UID: \"7efe7390-4de7-4f12-908a-334e6c6fe696\") " pod="kserve/model-serving-api-86f7b4b499-tbm4r" Apr 16 16:12:35.957223 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.957186 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7efe7390-4de7-4f12-908a-334e6c6fe696-tls-certs\") pod \"model-serving-api-86f7b4b499-tbm4r\" (UID: \"7efe7390-4de7-4f12-908a-334e6c6fe696\") " pod="kserve/model-serving-api-86f7b4b499-tbm4r" Apr 16 16:12:35.957223 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.957226 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17995599-6c5a-4b44-9ee4-7a83662efb06-cert\") pod \"odh-model-controller-696fc77849-76ck7\" (UID: \"17995599-6c5a-4b44-9ee4-7a83662efb06\") " pod="kserve/odh-model-controller-696fc77849-76ck7" Apr 16 16:12:35.957659 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.957252 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nc8p\" (UniqueName: \"kubernetes.io/projected/17995599-6c5a-4b44-9ee4-7a83662efb06-kube-api-access-4nc8p\") pod 
\"odh-model-controller-696fc77849-76ck7\" (UID: \"17995599-6c5a-4b44-9ee4-7a83662efb06\") " pod="kserve/odh-model-controller-696fc77849-76ck7" Apr 16 16:12:35.957659 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.957281 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmq7x\" (UniqueName: \"kubernetes.io/projected/7efe7390-4de7-4f12-908a-334e6c6fe696-kube-api-access-pmq7x\") pod \"model-serving-api-86f7b4b499-tbm4r\" (UID: \"7efe7390-4de7-4f12-908a-334e6c6fe696\") " pod="kserve/model-serving-api-86f7b4b499-tbm4r" Apr 16 16:12:35.957659 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:12:35.957432 2581 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 16:12:35.957659 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:12:35.957521 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17995599-6c5a-4b44-9ee4-7a83662efb06-cert podName:17995599-6c5a-4b44-9ee4-7a83662efb06 nodeName:}" failed. No retries permitted until 2026-04-16 16:12:36.457497067 +0000 UTC m=+584.634558114 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17995599-6c5a-4b44-9ee4-7a83662efb06-cert") pod "odh-model-controller-696fc77849-76ck7" (UID: "17995599-6c5a-4b44-9ee4-7a83662efb06") : secret "odh-model-controller-webhook-cert" not found Apr 16 16:12:35.959671 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.959653 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7efe7390-4de7-4f12-908a-334e6c6fe696-tls-certs\") pod \"model-serving-api-86f7b4b499-tbm4r\" (UID: \"7efe7390-4de7-4f12-908a-334e6c6fe696\") " pod="kserve/model-serving-api-86f7b4b499-tbm4r" Apr 16 16:12:35.965768 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.965742 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nc8p\" (UniqueName: \"kubernetes.io/projected/17995599-6c5a-4b44-9ee4-7a83662efb06-kube-api-access-4nc8p\") pod \"odh-model-controller-696fc77849-76ck7\" (UID: \"17995599-6c5a-4b44-9ee4-7a83662efb06\") " pod="kserve/odh-model-controller-696fc77849-76ck7" Apr 16 16:12:35.966437 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:35.966419 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmq7x\" (UniqueName: \"kubernetes.io/projected/7efe7390-4de7-4f12-908a-334e6c6fe696-kube-api-access-pmq7x\") pod \"model-serving-api-86f7b4b499-tbm4r\" (UID: \"7efe7390-4de7-4f12-908a-334e6c6fe696\") " pod="kserve/model-serving-api-86f7b4b499-tbm4r" Apr 16 16:12:36.075794 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:36.075760 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-tbm4r" Apr 16 16:12:36.193587 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:36.193560 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-tbm4r"] Apr 16 16:12:36.195992 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:12:36.195965 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7efe7390_4de7_4f12_908a_334e6c6fe696.slice/crio-9adad24915f7839eb96963d4cf88d001af519515faae5fccf7ca36aeb666cde3 WatchSource:0}: Error finding container 9adad24915f7839eb96963d4cf88d001af519515faae5fccf7ca36aeb666cde3: Status 404 returned error can't find the container with id 9adad24915f7839eb96963d4cf88d001af519515faae5fccf7ca36aeb666cde3 Apr 16 16:12:36.462208 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:36.462111 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17995599-6c5a-4b44-9ee4-7a83662efb06-cert\") pod \"odh-model-controller-696fc77849-76ck7\" (UID: \"17995599-6c5a-4b44-9ee4-7a83662efb06\") " pod="kserve/odh-model-controller-696fc77849-76ck7" Apr 16 16:12:36.464527 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:36.464497 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17995599-6c5a-4b44-9ee4-7a83662efb06-cert\") pod \"odh-model-controller-696fc77849-76ck7\" (UID: \"17995599-6c5a-4b44-9ee4-7a83662efb06\") " pod="kserve/odh-model-controller-696fc77849-76ck7" Apr 16 16:12:36.689700 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:36.689667 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-76ck7" Apr 16 16:12:36.834272 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:36.834176 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-76ck7"] Apr 16 16:12:36.838445 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:12:36.838402 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17995599_6c5a_4b44_9ee4_7a83662efb06.slice/crio-fd3f7cc29cd482bc6e6bdb1f0054050dac9495527741ff9a2b0dbff2c2013887 WatchSource:0}: Error finding container fd3f7cc29cd482bc6e6bdb1f0054050dac9495527741ff9a2b0dbff2c2013887: Status 404 returned error can't find the container with id fd3f7cc29cd482bc6e6bdb1f0054050dac9495527741ff9a2b0dbff2c2013887 Apr 16 16:12:36.993047 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:36.992982 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-tbm4r" event={"ID":"7efe7390-4de7-4f12-908a-334e6c6fe696","Type":"ContainerStarted","Data":"9adad24915f7839eb96963d4cf88d001af519515faae5fccf7ca36aeb666cde3"} Apr 16 16:12:36.994322 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:36.994292 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-76ck7" event={"ID":"17995599-6c5a-4b44-9ee4-7a83662efb06","Type":"ContainerStarted","Data":"fd3f7cc29cd482bc6e6bdb1f0054050dac9495527741ff9a2b0dbff2c2013887"} Apr 16 16:12:40.006012 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:40.005902 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-tbm4r" 
event={"ID":"7efe7390-4de7-4f12-908a-334e6c6fe696","Type":"ContainerStarted","Data":"32280e9655411316dafe5c089a4288fd9e5b6ca60000c64d3d620fb9bf468634"} Apr 16 16:12:40.006470 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:40.006021 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-tbm4r" Apr 16 16:12:40.007195 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:40.007162 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-76ck7" event={"ID":"17995599-6c5a-4b44-9ee4-7a83662efb06","Type":"ContainerStarted","Data":"d8d1c2f1eb2721ea0df939fe22211c1d4ed3a11813318a30d9e565ab9a3428c0"} Apr 16 16:12:40.007348 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:40.007309 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-76ck7" Apr 16 16:12:40.023171 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:40.023123 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-tbm4r" podStartSLOduration=1.482597081 podStartE2EDuration="5.02310938s" podCreationTimestamp="2026-04-16 16:12:35 +0000 UTC" firstStartedPulling="2026-04-16 16:12:36.197623615 +0000 UTC m=+584.374684648" lastFinishedPulling="2026-04-16 16:12:39.73813591 +0000 UTC m=+587.915196947" observedRunningTime="2026-04-16 16:12:40.022909267 +0000 UTC m=+588.199970337" watchObservedRunningTime="2026-04-16 16:12:40.02310938 +0000 UTC m=+588.200170437" Apr 16 16:12:40.040214 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:40.040160 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-76ck7" podStartSLOduration=2.099542479 podStartE2EDuration="5.040143486s" podCreationTimestamp="2026-04-16 16:12:35 +0000 UTC" firstStartedPulling="2026-04-16 16:12:36.840161879 +0000 UTC m=+585.017222918" lastFinishedPulling="2026-04-16 16:12:39.780762879 +0000 UTC m=+587.957823925" observedRunningTime="2026-04-16 16:12:40.037769808 +0000 UTC m=+588.214830910" watchObservedRunningTime="2026-04-16 16:12:40.040143486 +0000 UTC m=+588.217204544" Apr 16 16:12:51.011816 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:51.011785 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-76ck7" Apr 16 16:12:51.013879 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:12:51.013858 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-tbm4r" Apr 16 16:13:11.417244 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:11.417162 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8"] Apr 16 16:13:11.422898 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:11.422880 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" Apr 16 16:13:11.425403 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:11.425365 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-wqwt4\"" Apr 16 16:13:11.427181 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:11.427160 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8"] Apr 16 16:13:11.531257 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:11.531216 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1676cda-2a69-4758-a9ca-a7dd11580d8d-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8\" (UID: \"b1676cda-2a69-4758-a9ca-a7dd11580d8d\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" Apr 16 16:13:11.631988 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:11.631952 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1676cda-2a69-4758-a9ca-a7dd11580d8d-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8\" (UID: \"b1676cda-2a69-4758-a9ca-a7dd11580d8d\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" Apr 16 16:13:11.632300 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:11.632282 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1676cda-2a69-4758-a9ca-a7dd11580d8d-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8\" (UID: \"b1676cda-2a69-4758-a9ca-a7dd11580d8d\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" Apr 16 16:13:11.734543 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:11.734446 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" Apr 16 16:13:11.851785 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:11.851740 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8"] Apr 16 16:13:11.854748 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:13:11.854717 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1676cda_2a69_4758_a9ca_a7dd11580d8d.slice/crio-4304ac03793a18928c792c00a8a388ed033fdcfbbc5d8b63ab4308bf5f8d43ea WatchSource:0}: Error finding container 4304ac03793a18928c792c00a8a388ed033fdcfbbc5d8b63ab4308bf5f8d43ea: Status 404 returned error can't find the container with id 4304ac03793a18928c792c00a8a388ed033fdcfbbc5d8b63ab4308bf5f8d43ea Apr 16 16:13:12.104367 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:12.104321 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" event={"ID":"b1676cda-2a69-4758-a9ca-a7dd11580d8d","Type":"ContainerStarted","Data":"4304ac03793a18928c792c00a8a388ed033fdcfbbc5d8b63ab4308bf5f8d43ea"} Apr 16 16:13:15.116074 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:15.115987 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" event={"ID":"b1676cda-2a69-4758-a9ca-a7dd11580d8d","Type":"ContainerStarted","Data":"4d6b129a71483e3c5725a1562622566ca36bf4af2e747113a017aeca05ef2453"} Apr 16 16:13:19.130352 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:19.130304 2581 generic.go:358] "Generic (PLEG): container finished" podID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerID="4d6b129a71483e3c5725a1562622566ca36bf4af2e747113a017aeca05ef2453" exitCode=0 Apr 16 16:13:19.130726 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:19.130375 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" event={"ID":"b1676cda-2a69-4758-a9ca-a7dd11580d8d","Type":"ContainerDied","Data":"4d6b129a71483e3c5725a1562622566ca36bf4af2e747113a017aeca05ef2453"} Apr 16 16:13:33.192664 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:33.192630 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" event={"ID":"b1676cda-2a69-4758-a9ca-a7dd11580d8d","Type":"ContainerStarted","Data":"6de0c509febe7fa1b76ff36ddf12fd25aea7d24375d24d4f034ae11e7de24444"} Apr 16 16:13:35.201169 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:35.201125 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" event={"ID":"b1676cda-2a69-4758-a9ca-a7dd11580d8d","Type":"ContainerStarted","Data":"2cb042743e1684f721dd51eab24037f2f286b0ae93fed4ace1090dde54c09bd2"} Apr 16 16:13:35.201564 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:35.201371 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" Apr 16 16:13:35.202803 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:35.202775 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 16:13:35.218289 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:35.218247 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podStartSLOduration=1.4034879789999999 podStartE2EDuration="24.218232282s" podCreationTimestamp="2026-04-16 16:13:11 +0000 UTC" firstStartedPulling="2026-04-16 16:13:11.856640148 +0000 UTC m=+620.033701182" lastFinishedPulling="2026-04-16 16:13:34.671384448 +0000 UTC m=+642.848445485" observedRunningTime="2026-04-16 16:13:35.216475556 +0000 UTC m=+643.393536622" watchObservedRunningTime="2026-04-16 16:13:35.218232282 +0000 UTC m=+643.395293336" Apr 16 16:13:36.204187 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:36.204157 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" Apr 16 16:13:36.204606 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:36.204303 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 16:13:36.205212 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:36.205189 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:13:37.207327 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:37.207287 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 16:13:37.207723 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:37.207702 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:13:47.208279 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:47.208228 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 16:13:47.208863 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:47.208719 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:13:57.207694 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:57.207639 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" 
podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 16:13:57.208092 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:13:57.208036 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:14:07.208037 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:07.207988 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 16:14:07.208553 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:07.208530 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:14:17.207745 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:17.207691 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 16:14:17.208234 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:17.208182 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:14:27.207638 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:27.207593 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 16:14:27.208121 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:27.208078 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:14:37.207767 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:37.207719 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 16:14:37.208304 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:37.208170 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:14:47.208293 ip-10-0-135-144 
kubenswrapper[2581]: I0416 16:14:47.208202 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" Apr 16 16:14:47.208722 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:47.208368 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" Apr 16 16:14:56.602725 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:56.602692 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8"] Apr 16 16:14:56.603102 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:56.602985 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="kserve-container" containerID="cri-o://6de0c509febe7fa1b76ff36ddf12fd25aea7d24375d24d4f034ae11e7de24444" gracePeriod=30 Apr 16 16:14:56.603163 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:56.603078 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="agent" containerID="cri-o://2cb042743e1684f721dd51eab24037f2f286b0ae93fed4ace1090dde54c09bd2" gracePeriod=30 Apr 16 16:14:56.715651 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:56.715622 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g"] Apr 16 16:14:56.717818 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:56.717797 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" Apr 16 16:14:56.726731 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:56.726706 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g"] Apr 16 16:14:56.763556 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:56.763522 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s"] Apr 16 16:14:56.766091 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:56.766069 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" Apr 16 16:14:56.777601 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:56.777574 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s"] Apr 16 16:14:56.851769 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:56.851736 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a520d0f-1263-46a8-901b-93faf79b59de-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g\" (UID: \"8a520d0f-1263-46a8-901b-93faf79b59de\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" Apr 16 16:14:56.952727 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:56.952628 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1078936f-0170-46f2-abbe-b429e8a45718-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s\" (UID: \"1078936f-0170-46f2-abbe-b429e8a45718\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" Apr 16 16:14:56.952881 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:56.952761 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a520d0f-1263-46a8-901b-93faf79b59de-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g\" (UID: \"8a520d0f-1263-46a8-901b-93faf79b59de\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" Apr 16 16:14:56.953115 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:56.953095 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a520d0f-1263-46a8-901b-93faf79b59de-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g\" (UID: \"8a520d0f-1263-46a8-901b-93faf79b59de\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" Apr 16 16:14:57.028705 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:57.028676 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" Apr 16 16:14:57.053650 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:57.053612 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1078936f-0170-46f2-abbe-b429e8a45718-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s\" (UID: \"1078936f-0170-46f2-abbe-b429e8a45718\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" Apr 16 16:14:57.053980 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:57.053957 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1078936f-0170-46f2-abbe-b429e8a45718-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s\" (UID: \"1078936f-0170-46f2-abbe-b429e8a45718\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" Apr 16 16:14:57.076829 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:57.076801 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" Apr 16 16:14:57.163649 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:57.163578 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g"] Apr 16 16:14:57.166981 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:14:57.166923 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a520d0f_1263_46a8_901b_93faf79b59de.slice/crio-6dd0441282b50096fd76e33ac13060cd49f99288b4f39d288fa37debfa9c2e14 WatchSource:0}: Error finding container 6dd0441282b50096fd76e33ac13060cd49f99288b4f39d288fa37debfa9c2e14: Status 404 returned error can't find the container with id 6dd0441282b50096fd76e33ac13060cd49f99288b4f39d288fa37debfa9c2e14 Apr 16 16:14:57.207432 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:57.207330 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 16:14:57.207688 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:57.207664 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:14:57.215703 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:57.215677 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s"] Apr 16 16:14:57.219025 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:14:57.218997 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1078936f_0170_46f2_abbe_b429e8a45718.slice/crio-7220cb1afda51cf419cceb36640fc0d9e91b80413d49cd788ac2edf7eaac9ac2 WatchSource:0}: Error finding container 7220cb1afda51cf419cceb36640fc0d9e91b80413d49cd788ac2edf7eaac9ac2: Status 404 returned error can't find the container with id 
7220cb1afda51cf419cceb36640fc0d9e91b80413d49cd788ac2edf7eaac9ac2 Apr 16 16:14:57.462362 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:57.462241 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" event={"ID":"8a520d0f-1263-46a8-901b-93faf79b59de","Type":"ContainerStarted","Data":"c570de0ea50fd7be4e29dbffcb6a5d9bbb9d57680b632c44a3c04869255b5818"} Apr 16 16:14:57.462362 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:57.462288 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" event={"ID":"8a520d0f-1263-46a8-901b-93faf79b59de","Type":"ContainerStarted","Data":"6dd0441282b50096fd76e33ac13060cd49f99288b4f39d288fa37debfa9c2e14"} Apr 16 16:14:57.463738 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:57.463698 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" event={"ID":"1078936f-0170-46f2-abbe-b429e8a45718","Type":"ContainerStarted","Data":"dd3217fd3f3a7c92c24e0e3f734b7c7c29343f7eed41c03b3906f044b642e096"} Apr 16 16:14:57.463738 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:14:57.463733 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" event={"ID":"1078936f-0170-46f2-abbe-b429e8a45718","Type":"ContainerStarted","Data":"7220cb1afda51cf419cceb36640fc0d9e91b80413d49cd788ac2edf7eaac9ac2"} Apr 16 16:15:01.476292 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:01.476259 2581 generic.go:358] "Generic (PLEG): container finished" podID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerID="6de0c509febe7fa1b76ff36ddf12fd25aea7d24375d24d4f034ae11e7de24444" exitCode=0 Apr 16 16:15:01.476664 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:01.476332 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" event={"ID":"b1676cda-2a69-4758-a9ca-a7dd11580d8d","Type":"ContainerDied","Data":"6de0c509febe7fa1b76ff36ddf12fd25aea7d24375d24d4f034ae11e7de24444"} Apr 16 16:15:01.477620 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:01.477597 2581 generic.go:358] "Generic (PLEG): container finished" podID="8a520d0f-1263-46a8-901b-93faf79b59de" containerID="c570de0ea50fd7be4e29dbffcb6a5d9bbb9d57680b632c44a3c04869255b5818" exitCode=0 Apr 16 16:15:01.477725 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:01.477669 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" event={"ID":"8a520d0f-1263-46a8-901b-93faf79b59de","Type":"ContainerDied","Data":"c570de0ea50fd7be4e29dbffcb6a5d9bbb9d57680b632c44a3c04869255b5818"} Apr 16 16:15:01.479089 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:01.479064 2581 generic.go:358] "Generic (PLEG): container finished" podID="1078936f-0170-46f2-abbe-b429e8a45718" containerID="dd3217fd3f3a7c92c24e0e3f734b7c7c29343f7eed41c03b3906f044b642e096" exitCode=0 Apr 16 16:15:01.479186 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:01.479095 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" event={"ID":"1078936f-0170-46f2-abbe-b429e8a45718","Type":"ContainerDied","Data":"dd3217fd3f3a7c92c24e0e3f734b7c7c29343f7eed41c03b3906f044b642e096"} Apr 16 16:15:02.486839 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:02.486804 2581 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" event={"ID":"8a520d0f-1263-46a8-901b-93faf79b59de","Type":"ContainerStarted","Data":"9f7ddff8eca30a2664565981e8a1465a0c1e00f02ba7f838857111e61879161f"} Apr 16 16:15:02.487383 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:02.487239 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" Apr 16 16:15:02.489087 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:02.489053 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" podUID="8a520d0f-1263-46a8-901b-93faf79b59de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 16:15:02.504541 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:02.504482 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" podStartSLOduration=6.5044624859999995 podStartE2EDuration="6.504462486s" podCreationTimestamp="2026-04-16 16:14:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:15:02.503144363 +0000 UTC m=+730.680205411" watchObservedRunningTime="2026-04-16 16:15:02.504462486 +0000 UTC m=+730.681523542" Apr 16 16:15:03.493016 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:03.492972 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" podUID="8a520d0f-1263-46a8-901b-93faf79b59de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 16:15:07.207609 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:07.207546 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 16:15:07.208104 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:07.207938 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:15:13.494104 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:13.494048 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" podUID="8a520d0f-1263-46a8-901b-93faf79b59de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 16:15:17.207569 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:17.207513 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 16 16:15:17.208060 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:17.207675 2581 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" Apr 16 16:15:17.208060 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:17.207851 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:15:17.208060 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:17.207924 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" Apr 16 16:15:19.550220 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:19.550184 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" event={"ID":"1078936f-0170-46f2-abbe-b429e8a45718","Type":"ContainerStarted","Data":"4007019ef4cce0a81687ee18be56567253016c8ed73ed922a7ef31e5006f9be8"} Apr 16 16:15:19.550651 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:19.550519 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" Apr 16 16:15:19.551878 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:19.551853 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" podUID="1078936f-0170-46f2-abbe-b429e8a45718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 16 16:15:19.569473 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:19.569421 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" podStartSLOduration=6.494408047 podStartE2EDuration="23.569401923s" podCreationTimestamp="2026-04-16 16:14:56 +0000 UTC" firstStartedPulling="2026-04-16 16:15:01.480309381 +0000 UTC m=+729.657370413" lastFinishedPulling="2026-04-16 16:15:18.555303253 +0000 UTC m=+746.732364289" observedRunningTime="2026-04-16 16:15:19.567806771 +0000 UTC m=+747.744867826" watchObservedRunningTime="2026-04-16 16:15:19.569401923 +0000 UTC m=+747.746462982" Apr 16 16:15:20.554641 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:20.554601 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" podUID="1078936f-0170-46f2-abbe-b429e8a45718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 16 16:15:23.493741 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:23.493700 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" podUID="8a520d0f-1263-46a8-901b-93faf79b59de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 16:15:27.249170 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:27.249146 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" Apr 16 16:15:27.265626 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:27.265605 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1676cda-2a69-4758-a9ca-a7dd11580d8d-kserve-provision-location\") pod \"b1676cda-2a69-4758-a9ca-a7dd11580d8d\" (UID: \"b1676cda-2a69-4758-a9ca-a7dd11580d8d\") " Apr 16 16:15:27.265886 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:27.265866 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1676cda-2a69-4758-a9ca-a7dd11580d8d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b1676cda-2a69-4758-a9ca-a7dd11580d8d" (UID: "b1676cda-2a69-4758-a9ca-a7dd11580d8d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:15:27.366683 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:27.366627 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1676cda-2a69-4758-a9ca-a7dd11580d8d-kserve-provision-location\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 16 16:15:27.576558 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:27.576523 2581 generic.go:358] "Generic (PLEG): container finished" podID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerID="2cb042743e1684f721dd51eab24037f2f286b0ae93fed4ace1090dde54c09bd2" exitCode=0 Apr 16 16:15:27.576724 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:27.576584 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" event={"ID":"b1676cda-2a69-4758-a9ca-a7dd11580d8d","Type":"ContainerDied","Data":"2cb042743e1684f721dd51eab24037f2f286b0ae93fed4ace1090dde54c09bd2"} Apr 16 16:15:27.576724 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:27.576591 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" Apr 16 16:15:27.576724 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:27.576615 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" event={"ID":"b1676cda-2a69-4758-a9ca-a7dd11580d8d","Type":"ContainerDied","Data":"4304ac03793a18928c792c00a8a388ed033fdcfbbc5d8b63ab4308bf5f8d43ea"} Apr 16 16:15:27.576724 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:27.576631 2581 scope.go:117] "RemoveContainer" containerID="2cb042743e1684f721dd51eab24037f2f286b0ae93fed4ace1090dde54c09bd2" Apr 16 16:15:27.585905 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:27.585884 2581 scope.go:117] "RemoveContainer" containerID="6de0c509febe7fa1b76ff36ddf12fd25aea7d24375d24d4f034ae11e7de24444" Apr 16 16:15:27.593047 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:27.593026 2581 scope.go:117] "RemoveContainer" containerID="4d6b129a71483e3c5725a1562622566ca36bf4af2e747113a017aeca05ef2453" Apr 16 16:15:27.598512 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:27.598491 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8"] Apr 16 16:15:27.600395 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:27.600328 2581 scope.go:117] "RemoveContainer" containerID="2cb042743e1684f721dd51eab24037f2f286b0ae93fed4ace1090dde54c09bd2" Apr 16 16:15:27.600703 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:15:27.600662 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cb042743e1684f721dd51eab24037f2f286b0ae93fed4ace1090dde54c09bd2\": container with ID starting with 2cb042743e1684f721dd51eab24037f2f286b0ae93fed4ace1090dde54c09bd2 not found: ID does not exist" containerID="2cb042743e1684f721dd51eab24037f2f286b0ae93fed4ace1090dde54c09bd2" Apr 16 16:15:27.600837 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:27.600698 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb042743e1684f721dd51eab24037f2f286b0ae93fed4ace1090dde54c09bd2"} err="failed to get container status \"2cb042743e1684f721dd51eab24037f2f286b0ae93fed4ace1090dde54c09bd2\": rpc error: code = NotFound desc = could not find container \"2cb042743e1684f721dd51eab24037f2f286b0ae93fed4ace1090dde54c09bd2\": container with ID starting with 2cb042743e1684f721dd51eab24037f2f286b0ae93fed4ace1090dde54c09bd2 not found: ID does not exist" Apr 16 16:15:27.600837 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:27.600724 2581 scope.go:117] "RemoveContainer" containerID="6de0c509febe7fa1b76ff36ddf12fd25aea7d24375d24d4f034ae11e7de24444" Apr 16 16:15:27.601093 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:15:27.601063 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6de0c509febe7fa1b76ff36ddf12fd25aea7d24375d24d4f034ae11e7de24444\": container with ID starting with 6de0c509febe7fa1b76ff36ddf12fd25aea7d24375d24d4f034ae11e7de24444 not found: ID does not exist" containerID="6de0c509febe7fa1b76ff36ddf12fd25aea7d24375d24d4f034ae11e7de24444" Apr 16 16:15:27.601157 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:27.601101 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de0c509febe7fa1b76ff36ddf12fd25aea7d24375d24d4f034ae11e7de24444"} err="failed to get container status 
\"6de0c509febe7fa1b76ff36ddf12fd25aea7d24375d24d4f034ae11e7de24444\": rpc error: code = NotFound desc = could not find container \"6de0c509febe7fa1b76ff36ddf12fd25aea7d24375d24d4f034ae11e7de24444\": container with ID starting with 6de0c509febe7fa1b76ff36ddf12fd25aea7d24375d24d4f034ae11e7de24444 not found: ID does not exist" Apr 16 16:15:27.601157 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:27.601123 2581 scope.go:117] "RemoveContainer" containerID="4d6b129a71483e3c5725a1562622566ca36bf4af2e747113a017aeca05ef2453" Apr 16 16:15:27.601416 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:15:27.601395 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d6b129a71483e3c5725a1562622566ca36bf4af2e747113a017aeca05ef2453\": container with ID starting with 4d6b129a71483e3c5725a1562622566ca36bf4af2e747113a017aeca05ef2453 not found: ID does not exist" containerID="4d6b129a71483e3c5725a1562622566ca36bf4af2e747113a017aeca05ef2453" Apr 16 16:15:27.601499 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:27.601425 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d6b129a71483e3c5725a1562622566ca36bf4af2e747113a017aeca05ef2453"} err="failed to get container status \"4d6b129a71483e3c5725a1562622566ca36bf4af2e747113a017aeca05ef2453\": rpc error: code = NotFound desc = could not find container \"4d6b129a71483e3c5725a1562622566ca36bf4af2e747113a017aeca05ef2453\": container with ID starting with 4d6b129a71483e3c5725a1562622566ca36bf4af2e747113a017aeca05ef2453 not found: ID does not exist" Apr 16 16:15:27.602024 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:27.602006 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8"] Apr 16 16:15:28.208280 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:28.208218 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="agent" probeResult="failure" output="Get \"http://10.133.0.22:9081/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 16 16:15:28.208280 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:28.208259 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-70fe9-predictor-667cd4c84d-rxgg8" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: i/o timeout" Apr 16 16:15:28.375602 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:28.375567 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" path="/var/lib/kubelet/pods/b1676cda-2a69-4758-a9ca-a7dd11580d8d/volumes" Apr 16 16:15:30.555192 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:30.555152 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" podUID="1078936f-0170-46f2-abbe-b429e8a45718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 16 16:15:33.493999 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:33.493946 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" 
podUID="8a520d0f-1263-46a8-901b-93faf79b59de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 16:15:40.555398 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:40.555315 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" podUID="1078936f-0170-46f2-abbe-b429e8a45718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 16 16:15:43.493217 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:43.493175 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" podUID="8a520d0f-1263-46a8-901b-93faf79b59de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 16:15:50.554978 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:50.554928 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" podUID="1078936f-0170-46f2-abbe-b429e8a45718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 16 16:15:53.493272 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:15:53.493221 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" podUID="8a520d0f-1263-46a8-901b-93faf79b59de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 16:16:00.555057 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:00.555007 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" podUID="1078936f-0170-46f2-abbe-b429e8a45718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 16 16:16:03.493553 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:03.493508 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" podUID="8a520d0f-1263-46a8-901b-93faf79b59de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 16 16:16:10.555314 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:10.555263 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" podUID="1078936f-0170-46f2-abbe-b429e8a45718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 16 16:16:13.494595 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:13.494517 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" Apr 16 16:16:20.556413 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:20.556380 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" Apr 16 16:16:36.953042 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:36.953004 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g"] Apr 16 
16:16:36.953459 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:36.953257 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" podUID="8a520d0f-1263-46a8-901b-93faf79b59de" containerName="kserve-container" containerID="cri-o://9f7ddff8eca30a2664565981e8a1465a0c1e00f02ba7f838857111e61879161f" gracePeriod=30 Apr 16 16:16:37.046534 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.046499 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq"] Apr 16 16:16:37.046863 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.046851 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="kserve-container" Apr 16 16:16:37.046906 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.046865 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="kserve-container" Apr 16 16:16:37.046906 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.046875 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="storage-initializer" Apr 16 16:16:37.046906 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.046884 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="storage-initializer" Apr 16 16:16:37.046906 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.046901 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="agent" Apr 16 16:16:37.046906 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.046907 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="agent" Apr 16 16:16:37.047054 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.046953 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="agent" Apr 16 16:16:37.047054 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.046964 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="b1676cda-2a69-4758-a9ca-a7dd11580d8d" containerName="kserve-container" Apr 16 16:16:37.050171 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.050146 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" Apr 16 16:16:37.057762 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.057736 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq"] Apr 16 16:16:37.092072 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.092042 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8"] Apr 16 16:16:37.095439 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.095418 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" Apr 16 16:16:37.100958 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.100931 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s"] Apr 16 16:16:37.101209 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.101169 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" podUID="1078936f-0170-46f2-abbe-b429e8a45718" containerName="kserve-container" containerID="cri-o://4007019ef4cce0a81687ee18be56567253016c8ed73ed922a7ef31e5006f9be8" gracePeriod=30 Apr 16 16:16:37.104559 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.104533 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8"] Apr 16 16:16:37.115644 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.115616 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bac4d96d-acc8-46c7-8f87-1f241d316e33-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq\" (UID: \"bac4d96d-acc8-46c7-8f87-1f241d316e33\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" Apr 16 16:16:37.115796 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.115667 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72c04038-8cbe-40f0-b00f-35632444d276-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8\" (UID: \"72c04038-8cbe-40f0-b00f-35632444d276\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" Apr 16 16:16:37.216373 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.216266 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bac4d96d-acc8-46c7-8f87-1f241d316e33-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq\" (UID: \"bac4d96d-acc8-46c7-8f87-1f241d316e33\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" Apr 16 16:16:37.216547 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.216381 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72c04038-8cbe-40f0-b00f-35632444d276-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8\" (UID: \"72c04038-8cbe-40f0-b00f-35632444d276\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" Apr 16 16:16:37.216643 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.216619 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bac4d96d-acc8-46c7-8f87-1f241d316e33-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq\" (UID: \"bac4d96d-acc8-46c7-8f87-1f241d316e33\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" Apr 16 16:16:37.216778 ip-10-0-135-144 
kubenswrapper[2581]: I0416 16:16:37.216757 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72c04038-8cbe-40f0-b00f-35632444d276-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8\" (UID: \"72c04038-8cbe-40f0-b00f-35632444d276\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" Apr 16 16:16:37.361400 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.361366 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" Apr 16 16:16:37.406990 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.406958 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" Apr 16 16:16:37.493313 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.493260 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq"] Apr 16 16:16:37.495504 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:16:37.495448 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbac4d96d_acc8_46c7_8f87_1f241d316e33.slice/crio-32f631aa0f45277e70924c414678694d12efb59cd33a7a37afe0369c93910fec WatchSource:0}: Error finding container 32f631aa0f45277e70924c414678694d12efb59cd33a7a37afe0369c93910fec: Status 404 returned error can't find the container with id 32f631aa0f45277e70924c414678694d12efb59cd33a7a37afe0369c93910fec Apr 16 16:16:37.498263 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.498245 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:16:37.550699 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.550642 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8"] Apr 16 16:16:37.552719 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:16:37.552694 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72c04038_8cbe_40f0_b00f_35632444d276.slice/crio-07f5a2189d6f88fb69ac1a5d81b13215c0682ececedc58e9c3b21bb916ca8343 WatchSource:0}: Error finding container 07f5a2189d6f88fb69ac1a5d81b13215c0682ececedc58e9c3b21bb916ca8343: Status 404 returned error can't find the container with id 07f5a2189d6f88fb69ac1a5d81b13215c0682ececedc58e9c3b21bb916ca8343 Apr 16 16:16:37.793411 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.793369 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" event={"ID":"bac4d96d-acc8-46c7-8f87-1f241d316e33","Type":"ContainerStarted","Data":"f66ce95050bac680fbea0fa30e863f0e8192ad43e4a0943e28d74e217d0563c6"} Apr 16 16:16:37.793411 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.793416 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" event={"ID":"bac4d96d-acc8-46c7-8f87-1f241d316e33","Type":"ContainerStarted","Data":"32f631aa0f45277e70924c414678694d12efb59cd33a7a37afe0369c93910fec"} Apr 16 16:16:37.794767 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.794739 2581 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" event={"ID":"72c04038-8cbe-40f0-b00f-35632444d276","Type":"ContainerStarted","Data":"3f69f40df2d0afcb9cff41a45e090183f571e53459b11fa156bfd004ffbf99bf"} Apr 16 16:16:37.794888 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:37.794773 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" event={"ID":"72c04038-8cbe-40f0-b00f-35632444d276","Type":"ContainerStarted","Data":"07f5a2189d6f88fb69ac1a5d81b13215c0682ececedc58e9c3b21bb916ca8343"} Apr 16 16:16:40.554960 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:40.554923 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" podUID="1078936f-0170-46f2-abbe-b429e8a45718" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 16 16:16:40.948809 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:40.948787 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" Apr 16 16:16:41.043779 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.043747 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1078936f-0170-46f2-abbe-b429e8a45718-kserve-provision-location\") pod \"1078936f-0170-46f2-abbe-b429e8a45718\" (UID: \"1078936f-0170-46f2-abbe-b429e8a45718\") " Apr 16 16:16:41.044096 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.044074 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1078936f-0170-46f2-abbe-b429e8a45718-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1078936f-0170-46f2-abbe-b429e8a45718" (UID: "1078936f-0170-46f2-abbe-b429e8a45718"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:16:41.145221 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.145189 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1078936f-0170-46f2-abbe-b429e8a45718-kserve-provision-location\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 16 16:16:41.581707 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.581685 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" Apr 16 16:16:41.649586 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.649499 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a520d0f-1263-46a8-901b-93faf79b59de-kserve-provision-location\") pod \"8a520d0f-1263-46a8-901b-93faf79b59de\" (UID: \"8a520d0f-1263-46a8-901b-93faf79b59de\") " Apr 16 16:16:41.649830 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.649807 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a520d0f-1263-46a8-901b-93faf79b59de-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8a520d0f-1263-46a8-901b-93faf79b59de" (UID: "8a520d0f-1263-46a8-901b-93faf79b59de"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:16:41.750276 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.750241 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a520d0f-1263-46a8-901b-93faf79b59de-kserve-provision-location\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 16 16:16:41.810317 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.810288 2581 generic.go:358] "Generic (PLEG): container finished" podID="8a520d0f-1263-46a8-901b-93faf79b59de" containerID="9f7ddff8eca30a2664565981e8a1465a0c1e00f02ba7f838857111e61879161f" exitCode=0 Apr 16 16:16:41.810485 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.810375 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" Apr 16 16:16:41.810485 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.810384 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" event={"ID":"8a520d0f-1263-46a8-901b-93faf79b59de","Type":"ContainerDied","Data":"9f7ddff8eca30a2664565981e8a1465a0c1e00f02ba7f838857111e61879161f"} Apr 16 16:16:41.810485 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.810425 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g" event={"ID":"8a520d0f-1263-46a8-901b-93faf79b59de","Type":"ContainerDied","Data":"6dd0441282b50096fd76e33ac13060cd49f99288b4f39d288fa37debfa9c2e14"} Apr 16 16:16:41.810485 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.810443 2581 scope.go:117] "RemoveContainer" containerID="9f7ddff8eca30a2664565981e8a1465a0c1e00f02ba7f838857111e61879161f" Apr 16 16:16:41.817178 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.817151 2581 generic.go:358] "Generic (PLEG): container finished" podID="bac4d96d-acc8-46c7-8f87-1f241d316e33" containerID="f66ce95050bac680fbea0fa30e863f0e8192ad43e4a0943e28d74e217d0563c6" exitCode=0 Apr 16 16:16:41.817293 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.817221 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" event={"ID":"bac4d96d-acc8-46c7-8f87-1f241d316e33","Type":"ContainerDied","Data":"f66ce95050bac680fbea0fa30e863f0e8192ad43e4a0943e28d74e217d0563c6"} Apr 16 16:16:41.818864 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.818840 2581 generic.go:358] "Generic (PLEG): container finished" podID="1078936f-0170-46f2-abbe-b429e8a45718" containerID="4007019ef4cce0a81687ee18be56567253016c8ed73ed922a7ef31e5006f9be8" exitCode=0 Apr 16 16:16:41.818972 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.818923 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" Apr 16 16:16:41.818972 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.818925 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" event={"ID":"1078936f-0170-46f2-abbe-b429e8a45718","Type":"ContainerDied","Data":"4007019ef4cce0a81687ee18be56567253016c8ed73ed922a7ef31e5006f9be8"} Apr 16 16:16:41.818972 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.818954 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s" event={"ID":"1078936f-0170-46f2-abbe-b429e8a45718","Type":"ContainerDied","Data":"7220cb1afda51cf419cceb36640fc0d9e91b80413d49cd788ac2edf7eaac9ac2"} Apr 16 16:16:41.820472 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.820449 2581 generic.go:358] "Generic (PLEG): container finished" podID="72c04038-8cbe-40f0-b00f-35632444d276" containerID="3f69f40df2d0afcb9cff41a45e090183f571e53459b11fa156bfd004ffbf99bf" exitCode=0 Apr 16 16:16:41.820544 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.820503 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" event={"ID":"72c04038-8cbe-40f0-b00f-35632444d276","Type":"ContainerDied","Data":"3f69f40df2d0afcb9cff41a45e090183f571e53459b11fa156bfd004ffbf99bf"} Apr 16 16:16:41.825595 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.825478 2581 scope.go:117] "RemoveContainer" containerID="c570de0ea50fd7be4e29dbffcb6a5d9bbb9d57680b632c44a3c04869255b5818" Apr 16 16:16:41.836756 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.836733 2581 scope.go:117] "RemoveContainer" containerID="9f7ddff8eca30a2664565981e8a1465a0c1e00f02ba7f838857111e61879161f" Apr 16 16:16:41.837083 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:16:41.837062 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f7ddff8eca30a2664565981e8a1465a0c1e00f02ba7f838857111e61879161f\": container with ID starting with 9f7ddff8eca30a2664565981e8a1465a0c1e00f02ba7f838857111e61879161f not found: ID does not exist" containerID="9f7ddff8eca30a2664565981e8a1465a0c1e00f02ba7f838857111e61879161f" Apr 16 16:16:41.837179 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.837091 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f7ddff8eca30a2664565981e8a1465a0c1e00f02ba7f838857111e61879161f"} err="failed to get container status \"9f7ddff8eca30a2664565981e8a1465a0c1e00f02ba7f838857111e61879161f\": rpc error: code = NotFound desc = could not find container \"9f7ddff8eca30a2664565981e8a1465a0c1e00f02ba7f838857111e61879161f\": container with ID starting with 9f7ddff8eca30a2664565981e8a1465a0c1e00f02ba7f838857111e61879161f not found: ID does not exist" Apr 16 16:16:41.837179 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.837109 2581 scope.go:117] "RemoveContainer" containerID="c570de0ea50fd7be4e29dbffcb6a5d9bbb9d57680b632c44a3c04869255b5818" Apr 16 16:16:41.837378 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:16:41.837358 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c570de0ea50fd7be4e29dbffcb6a5d9bbb9d57680b632c44a3c04869255b5818\": container with ID starting with c570de0ea50fd7be4e29dbffcb6a5d9bbb9d57680b632c44a3c04869255b5818 not 
found: ID does not exist" containerID="c570de0ea50fd7be4e29dbffcb6a5d9bbb9d57680b632c44a3c04869255b5818" Apr 16 16:16:41.837431 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.837384 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c570de0ea50fd7be4e29dbffcb6a5d9bbb9d57680b632c44a3c04869255b5818"} err="failed to get container status \"c570de0ea50fd7be4e29dbffcb6a5d9bbb9d57680b632c44a3c04869255b5818\": rpc error: code = NotFound desc = could not find container \"c570de0ea50fd7be4e29dbffcb6a5d9bbb9d57680b632c44a3c04869255b5818\": container with ID starting with c570de0ea50fd7be4e29dbffcb6a5d9bbb9d57680b632c44a3c04869255b5818 not found: ID does not exist" Apr 16 16:16:41.837431 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.837399 2581 scope.go:117] "RemoveContainer" containerID="4007019ef4cce0a81687ee18be56567253016c8ed73ed922a7ef31e5006f9be8" Apr 16 16:16:41.848991 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.848973 2581 scope.go:117] "RemoveContainer" containerID="dd3217fd3f3a7c92c24e0e3f734b7c7c29343f7eed41c03b3906f044b642e096" Apr 16 16:16:41.858150 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.858127 2581 scope.go:117] "RemoveContainer" containerID="4007019ef4cce0a81687ee18be56567253016c8ed73ed922a7ef31e5006f9be8" Apr 16 16:16:41.858474 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:16:41.858454 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4007019ef4cce0a81687ee18be56567253016c8ed73ed922a7ef31e5006f9be8\": container with ID starting with 4007019ef4cce0a81687ee18be56567253016c8ed73ed922a7ef31e5006f9be8 not found: ID does not exist" containerID="4007019ef4cce0a81687ee18be56567253016c8ed73ed922a7ef31e5006f9be8" Apr 16 16:16:41.858548 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.858481 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4007019ef4cce0a81687ee18be56567253016c8ed73ed922a7ef31e5006f9be8"} err="failed to get container status \"4007019ef4cce0a81687ee18be56567253016c8ed73ed922a7ef31e5006f9be8\": rpc error: code = NotFound desc = could not find container \"4007019ef4cce0a81687ee18be56567253016c8ed73ed922a7ef31e5006f9be8\": container with ID starting with 4007019ef4cce0a81687ee18be56567253016c8ed73ed922a7ef31e5006f9be8 not found: ID does not exist" Apr 16 16:16:41.858548 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.858501 2581 scope.go:117] "RemoveContainer" containerID="dd3217fd3f3a7c92c24e0e3f734b7c7c29343f7eed41c03b3906f044b642e096" Apr 16 16:16:41.858776 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:16:41.858758 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd3217fd3f3a7c92c24e0e3f734b7c7c29343f7eed41c03b3906f044b642e096\": container with ID starting with dd3217fd3f3a7c92c24e0e3f734b7c7c29343f7eed41c03b3906f044b642e096 not found: ID does not exist" containerID="dd3217fd3f3a7c92c24e0e3f734b7c7c29343f7eed41c03b3906f044b642e096" Apr 16 16:16:41.858824 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.858782 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd3217fd3f3a7c92c24e0e3f734b7c7c29343f7eed41c03b3906f044b642e096"} err="failed to get container status \"dd3217fd3f3a7c92c24e0e3f734b7c7c29343f7eed41c03b3906f044b642e096\": rpc error: code = NotFound desc = could not find container 
\"dd3217fd3f3a7c92c24e0e3f734b7c7c29343f7eed41c03b3906f044b642e096\": container with ID starting with dd3217fd3f3a7c92c24e0e3f734b7c7c29343f7eed41c03b3906f044b642e096 not found: ID does not exist" Apr 16 16:16:41.859287 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.859268 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s"] Apr 16 16:16:41.862153 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.862135 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b643-predictor-94775489-88l6s"] Apr 16 16:16:41.873746 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.873717 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g"] Apr 16 16:16:41.877397 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:41.877330 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-3b643-predictor-584c5864d8-42v2g"] Apr 16 16:16:42.375939 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:42.375910 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1078936f-0170-46f2-abbe-b429e8a45718" path="/var/lib/kubelet/pods/1078936f-0170-46f2-abbe-b429e8a45718/volumes" Apr 16 16:16:42.376278 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:42.376265 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a520d0f-1263-46a8-901b-93faf79b59de" path="/var/lib/kubelet/pods/8a520d0f-1263-46a8-901b-93faf79b59de/volumes" Apr 16 16:16:42.825811 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:42.825778 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" event={"ID":"bac4d96d-acc8-46c7-8f87-1f241d316e33","Type":"ContainerStarted","Data":"dfd7b3dd526a046ce909f229e5b3ac1a8c09f6bfa6f0e7df03bf45ad37d8725e"} Apr 16 16:16:42.826255 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:42.826089 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" Apr 16 16:16:42.827811 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:42.827783 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" podUID="bac4d96d-acc8-46c7-8f87-1f241d316e33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 16 16:16:42.828303 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:42.828283 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" event={"ID":"72c04038-8cbe-40f0-b00f-35632444d276","Type":"ContainerStarted","Data":"463ff7908f2a3d96174a07ce5957d4de481422e8025e1b1e55700a7fbad11664"} Apr 16 16:16:42.828561 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:42.828548 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" Apr 16 16:16:42.829541 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:42.829519 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" podUID="72c04038-8cbe-40f0-b00f-35632444d276" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 16:16:42.845295 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:42.845253 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" podStartSLOduration=5.845240041 podStartE2EDuration="5.845240041s" podCreationTimestamp="2026-04-16 16:16:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:16:42.843200228 +0000 UTC m=+831.020261288" watchObservedRunningTime="2026-04-16 16:16:42.845240041 +0000 UTC m=+831.022301095" Apr 16 16:16:42.860114 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:42.860072 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" podStartSLOduration=5.86005918 podStartE2EDuration="5.86005918s" podCreationTimestamp="2026-04-16 16:16:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:16:42.858865235 +0000 UTC m=+831.035926281" watchObservedRunningTime="2026-04-16 16:16:42.86005918 +0000 UTC m=+831.037120235" Apr 16 16:16:43.832149 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:43.832101 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" podUID="72c04038-8cbe-40f0-b00f-35632444d276" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 16:16:43.832548 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:43.832166 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" podUID="bac4d96d-acc8-46c7-8f87-1f241d316e33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 16 16:16:53.832730 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:53.832678 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" podUID="bac4d96d-acc8-46c7-8f87-1f241d316e33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 16 16:16:53.833146 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:16:53.832678 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" podUID="72c04038-8cbe-40f0-b00f-35632444d276" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 16:17:03.833176 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:17:03.833122 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" podUID="bac4d96d-acc8-46c7-8f87-1f241d316e33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 16 16:17:03.833582 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:17:03.833122 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" podUID="72c04038-8cbe-40f0-b00f-35632444d276" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 16:17:13.837434 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:17:13.837387 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" podUID="bac4d96d-acc8-46c7-8f87-1f241d316e33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 16 16:17:13.837837 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:17:13.837393 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" podUID="72c04038-8cbe-40f0-b00f-35632444d276" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 16:17:23.832521 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:17:23.832469 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" podUID="72c04038-8cbe-40f0-b00f-35632444d276" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 16:17:23.832997 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:17:23.832472 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" podUID="bac4d96d-acc8-46c7-8f87-1f241d316e33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 16 16:17:33.832208 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:17:33.832162 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" podUID="bac4d96d-acc8-46c7-8f87-1f241d316e33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 16 16:17:33.832631 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:17:33.832160 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" podUID="72c04038-8cbe-40f0-b00f-35632444d276" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 16:17:43.832596 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:17:43.832504 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" podUID="bac4d96d-acc8-46c7-8f87-1f241d316e33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 16 16:17:43.833506 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:17:43.833482 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" Apr 16 16:17:53.833527 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:17:53.833497 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" Apr 16 16:18:17.266907 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:17.266862 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq"] Apr 16 
16:18:17.269397 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:17.267111 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" podUID="bac4d96d-acc8-46c7-8f87-1f241d316e33" containerName="kserve-container" containerID="cri-o://dfd7b3dd526a046ce909f229e5b3ac1a8c09f6bfa6f0e7df03bf45ad37d8725e" gracePeriod=30 Apr 16 16:18:17.365998 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:17.365964 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8"] Apr 16 16:18:17.366300 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:17.366276 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" podUID="72c04038-8cbe-40f0-b00f-35632444d276" containerName="kserve-container" containerID="cri-o://463ff7908f2a3d96174a07ce5957d4de481422e8025e1b1e55700a7fbad11664" gracePeriod=30 Apr 16 16:18:21.132045 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:21.132021 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" Apr 16 16:18:21.147918 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:21.147835 2581 generic.go:358] "Generic (PLEG): container finished" podID="72c04038-8cbe-40f0-b00f-35632444d276" containerID="463ff7908f2a3d96174a07ce5957d4de481422e8025e1b1e55700a7fbad11664" exitCode=0 Apr 16 16:18:21.147918 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:21.147885 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" event={"ID":"72c04038-8cbe-40f0-b00f-35632444d276","Type":"ContainerDied","Data":"463ff7908f2a3d96174a07ce5957d4de481422e8025e1b1e55700a7fbad11664"} Apr 16 16:18:21.147918 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:21.147898 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" Apr 16 16:18:21.148127 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:21.147919 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8" event={"ID":"72c04038-8cbe-40f0-b00f-35632444d276","Type":"ContainerDied","Data":"07f5a2189d6f88fb69ac1a5d81b13215c0682ececedc58e9c3b21bb916ca8343"} Apr 16 16:18:21.148127 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:21.147942 2581 scope.go:117] "RemoveContainer" containerID="463ff7908f2a3d96174a07ce5957d4de481422e8025e1b1e55700a7fbad11664" Apr 16 16:18:21.157139 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:21.157119 2581 scope.go:117] "RemoveContainer" containerID="3f69f40df2d0afcb9cff41a45e090183f571e53459b11fa156bfd004ffbf99bf" Apr 16 16:18:21.164208 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:21.164187 2581 scope.go:117] "RemoveContainer" containerID="463ff7908f2a3d96174a07ce5957d4de481422e8025e1b1e55700a7fbad11664" Apr 16 16:18:21.164475 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:18:21.164457 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"463ff7908f2a3d96174a07ce5957d4de481422e8025e1b1e55700a7fbad11664\": container with ID starting with 463ff7908f2a3d96174a07ce5957d4de481422e8025e1b1e55700a7fbad11664 not found: ID does not exist" containerID="463ff7908f2a3d96174a07ce5957d4de481422e8025e1b1e55700a7fbad11664" Apr 16 16:18:21.164537 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:21.164485 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"463ff7908f2a3d96174a07ce5957d4de481422e8025e1b1e55700a7fbad11664"} err="failed to get container status \"463ff7908f2a3d96174a07ce5957d4de481422e8025e1b1e55700a7fbad11664\": rpc error: code = NotFound desc = could not find container \"463ff7908f2a3d96174a07ce5957d4de481422e8025e1b1e55700a7fbad11664\": container with ID starting with 463ff7908f2a3d96174a07ce5957d4de481422e8025e1b1e55700a7fbad11664 not found: ID does not exist" Apr 16 16:18:21.164537 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:21.164502 2581 scope.go:117] "RemoveContainer" containerID="3f69f40df2d0afcb9cff41a45e090183f571e53459b11fa156bfd004ffbf99bf" Apr 16 16:18:21.164723 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:18:21.164706 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f69f40df2d0afcb9cff41a45e090183f571e53459b11fa156bfd004ffbf99bf\": container with ID starting with 3f69f40df2d0afcb9cff41a45e090183f571e53459b11fa156bfd004ffbf99bf not found: ID does not exist" containerID="3f69f40df2d0afcb9cff41a45e090183f571e53459b11fa156bfd004ffbf99bf" Apr 16 16:18:21.164763 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:21.164728 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f69f40df2d0afcb9cff41a45e090183f571e53459b11fa156bfd004ffbf99bf"} err="failed to get container status \"3f69f40df2d0afcb9cff41a45e090183f571e53459b11fa156bfd004ffbf99bf\": rpc error: code = NotFound desc = could not find container \"3f69f40df2d0afcb9cff41a45e090183f571e53459b11fa156bfd004ffbf99bf\": container with ID starting with 3f69f40df2d0afcb9cff41a45e090183f571e53459b11fa156bfd004ffbf99bf not found: ID does not exist" Apr 16 16:18:21.213321 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:21.213287 2581 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72c04038-8cbe-40f0-b00f-35632444d276-kserve-provision-location\") pod \"72c04038-8cbe-40f0-b00f-35632444d276\" (UID: \"72c04038-8cbe-40f0-b00f-35632444d276\") " Apr 16 16:18:21.213624 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:21.213599 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72c04038-8cbe-40f0-b00f-35632444d276-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "72c04038-8cbe-40f0-b00f-35632444d276" (UID: "72c04038-8cbe-40f0-b00f-35632444d276"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:18:21.313889 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:21.313856 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72c04038-8cbe-40f0-b00f-35632444d276-kserve-provision-location\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 16 16:18:21.468704 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:21.468662 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8"] Apr 16 16:18:21.472146 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:21.472123 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-724bc-predictor-8485c796b7-nglk8"] Apr 16 16:18:21.915086 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:21.915065 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" Apr 16 16:18:22.018841 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:22.018802 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bac4d96d-acc8-46c7-8f87-1f241d316e33-kserve-provision-location\") pod \"bac4d96d-acc8-46c7-8f87-1f241d316e33\" (UID: \"bac4d96d-acc8-46c7-8f87-1f241d316e33\") " Apr 16 16:18:22.019161 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:22.019141 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bac4d96d-acc8-46c7-8f87-1f241d316e33-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bac4d96d-acc8-46c7-8f87-1f241d316e33" (UID: "bac4d96d-acc8-46c7-8f87-1f241d316e33"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:18:22.120077 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:22.120043 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bac4d96d-acc8-46c7-8f87-1f241d316e33-kserve-provision-location\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 16 16:18:22.152632 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:22.152602 2581 generic.go:358] "Generic (PLEG): container finished" podID="bac4d96d-acc8-46c7-8f87-1f241d316e33" containerID="dfd7b3dd526a046ce909f229e5b3ac1a8c09f6bfa6f0e7df03bf45ad37d8725e" exitCode=0 Apr 16 16:18:22.153064 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:22.152656 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" event={"ID":"bac4d96d-acc8-46c7-8f87-1f241d316e33","Type":"ContainerDied","Data":"dfd7b3dd526a046ce909f229e5b3ac1a8c09f6bfa6f0e7df03bf45ad37d8725e"} Apr 16 16:18:22.153064 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:22.152670 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" Apr 16 16:18:22.153064 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:22.152680 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq" event={"ID":"bac4d96d-acc8-46c7-8f87-1f241d316e33","Type":"ContainerDied","Data":"32f631aa0f45277e70924c414678694d12efb59cd33a7a37afe0369c93910fec"} Apr 16 16:18:22.153064 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:22.152695 2581 scope.go:117] "RemoveContainer" containerID="dfd7b3dd526a046ce909f229e5b3ac1a8c09f6bfa6f0e7df03bf45ad37d8725e" Apr 16 16:18:22.160446 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:22.160415 2581 scope.go:117] "RemoveContainer" containerID="f66ce95050bac680fbea0fa30e863f0e8192ad43e4a0943e28d74e217d0563c6" Apr 16 16:18:22.167360 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:22.167330 2581 scope.go:117] "RemoveContainer" containerID="dfd7b3dd526a046ce909f229e5b3ac1a8c09f6bfa6f0e7df03bf45ad37d8725e" Apr 16 16:18:22.167611 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:18:22.167593 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfd7b3dd526a046ce909f229e5b3ac1a8c09f6bfa6f0e7df03bf45ad37d8725e\": container with ID starting with dfd7b3dd526a046ce909f229e5b3ac1a8c09f6bfa6f0e7df03bf45ad37d8725e not found: ID does not exist" containerID="dfd7b3dd526a046ce909f229e5b3ac1a8c09f6bfa6f0e7df03bf45ad37d8725e" Apr 16 16:18:22.167658 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:22.167619 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfd7b3dd526a046ce909f229e5b3ac1a8c09f6bfa6f0e7df03bf45ad37d8725e"} err="failed to get container status \"dfd7b3dd526a046ce909f229e5b3ac1a8c09f6bfa6f0e7df03bf45ad37d8725e\": rpc error: code = NotFound desc = could not find container \"dfd7b3dd526a046ce909f229e5b3ac1a8c09f6bfa6f0e7df03bf45ad37d8725e\": container with ID starting with dfd7b3dd526a046ce909f229e5b3ac1a8c09f6bfa6f0e7df03bf45ad37d8725e not found: ID does not exist" Apr 16 16:18:22.167658 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:22.167641 2581 scope.go:117] "RemoveContainer" containerID="f66ce95050bac680fbea0fa30e863f0e8192ad43e4a0943e28d74e217d0563c6" Apr 16 
16:18:22.167888 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:18:22.167869 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f66ce95050bac680fbea0fa30e863f0e8192ad43e4a0943e28d74e217d0563c6\": container with ID starting with f66ce95050bac680fbea0fa30e863f0e8192ad43e4a0943e28d74e217d0563c6 not found: ID does not exist" containerID="f66ce95050bac680fbea0fa30e863f0e8192ad43e4a0943e28d74e217d0563c6" Apr 16 16:18:22.167939 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:22.167893 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f66ce95050bac680fbea0fa30e863f0e8192ad43e4a0943e28d74e217d0563c6"} err="failed to get container status \"f66ce95050bac680fbea0fa30e863f0e8192ad43e4a0943e28d74e217d0563c6\": rpc error: code = NotFound desc = could not find container \"f66ce95050bac680fbea0fa30e863f0e8192ad43e4a0943e28d74e217d0563c6\": container with ID starting with f66ce95050bac680fbea0fa30e863f0e8192ad43e4a0943e28d74e217d0563c6 not found: ID does not exist" Apr 16 16:18:22.172648 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:22.172625 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq"] Apr 16 16:18:22.175919 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:22.175898 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-724bc-predictor-795997bf5d-xwcpq"] Apr 16 16:18:22.376404 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:22.376311 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72c04038-8cbe-40f0-b00f-35632444d276" path="/var/lib/kubelet/pods/72c04038-8cbe-40f0-b00f-35632444d276/volumes" Apr 16 16:18:22.376711 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:22.376695 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bac4d96d-acc8-46c7-8f87-1f241d316e33" path="/var/lib/kubelet/pods/bac4d96d-acc8-46c7-8f87-1f241d316e33/volumes" Apr 16 16:18:27.324562 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324530 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b"] Apr 16 16:18:27.325025 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324808 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a520d0f-1263-46a8-901b-93faf79b59de" containerName="storage-initializer" Apr 16 16:18:27.325025 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324818 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a520d0f-1263-46a8-901b-93faf79b59de" containerName="storage-initializer" Apr 16 16:18:27.325025 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324828 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a520d0f-1263-46a8-901b-93faf79b59de" containerName="kserve-container" Apr 16 16:18:27.325025 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324834 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a520d0f-1263-46a8-901b-93faf79b59de" containerName="kserve-container" Apr 16 16:18:27.325025 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324850 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1078936f-0170-46f2-abbe-b429e8a45718" containerName="storage-initializer" Apr 16 16:18:27.325025 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324856 2581 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="1078936f-0170-46f2-abbe-b429e8a45718" containerName="storage-initializer" Apr 16 16:18:27.325025 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324864 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1078936f-0170-46f2-abbe-b429e8a45718" containerName="kserve-container" Apr 16 16:18:27.325025 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324869 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="1078936f-0170-46f2-abbe-b429e8a45718" containerName="kserve-container" Apr 16 16:18:27.325025 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324874 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72c04038-8cbe-40f0-b00f-35632444d276" containerName="kserve-container" Apr 16 16:18:27.325025 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324880 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="72c04038-8cbe-40f0-b00f-35632444d276" containerName="kserve-container" Apr 16 16:18:27.325025 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324888 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72c04038-8cbe-40f0-b00f-35632444d276" containerName="storage-initializer" Apr 16 16:18:27.325025 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324895 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="72c04038-8cbe-40f0-b00f-35632444d276" containerName="storage-initializer" Apr 16 16:18:27.325025 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324902 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bac4d96d-acc8-46c7-8f87-1f241d316e33" containerName="kserve-container" Apr 16 16:18:27.325025 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324907 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="bac4d96d-acc8-46c7-8f87-1f241d316e33" containerName="kserve-container" Apr 16 16:18:27.325025 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324917 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bac4d96d-acc8-46c7-8f87-1f241d316e33" containerName="storage-initializer" Apr 16 16:18:27.325025 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324923 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="bac4d96d-acc8-46c7-8f87-1f241d316e33" containerName="storage-initializer" Apr 16 16:18:27.325025 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324974 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a520d0f-1263-46a8-901b-93faf79b59de" containerName="kserve-container" Apr 16 16:18:27.325025 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324980 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="1078936f-0170-46f2-abbe-b429e8a45718" containerName="kserve-container" Apr 16 16:18:27.325025 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324988 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="72c04038-8cbe-40f0-b00f-35632444d276" containerName="kserve-container" Apr 16 16:18:27.325025 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.324995 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="bac4d96d-acc8-46c7-8f87-1f241d316e33" containerName="kserve-container" Apr 16 16:18:27.329593 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.329571 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" Apr 16 16:18:27.331823 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.331800 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-wqwt4\"" Apr 16 16:18:27.337040 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.337019 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b"] Apr 16 16:18:27.357841 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.357810 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7bcf803-b1ee-4125-94ab-1012599c0fa0-kserve-provision-location\") pod \"isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b\" (UID: \"e7bcf803-b1ee-4125-94ab-1012599c0fa0\") " pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" Apr 16 16:18:27.458590 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.458551 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7bcf803-b1ee-4125-94ab-1012599c0fa0-kserve-provision-location\") pod \"isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b\" (UID: \"e7bcf803-b1ee-4125-94ab-1012599c0fa0\") " pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" Apr 16 16:18:27.458975 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.458954 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7bcf803-b1ee-4125-94ab-1012599c0fa0-kserve-provision-location\") pod \"isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b\" (UID: \"e7bcf803-b1ee-4125-94ab-1012599c0fa0\") " pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" Apr 16 16:18:27.640484 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.640389 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" Apr 16 16:18:27.759520 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:27.759485 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b"] Apr 16 16:18:27.763966 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:18:27.763934 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7bcf803_b1ee_4125_94ab_1012599c0fa0.slice/crio-f37ab06bf55d8a98387d48328e912b1ead46412b0c66df53d467e421b3fa54b4 WatchSource:0}: Error finding container f37ab06bf55d8a98387d48328e912b1ead46412b0c66df53d467e421b3fa54b4: Status 404 returned error can't find the container with id f37ab06bf55d8a98387d48328e912b1ead46412b0c66df53d467e421b3fa54b4 Apr 16 16:18:28.173303 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:28.173265 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" event={"ID":"e7bcf803-b1ee-4125-94ab-1012599c0fa0","Type":"ContainerStarted","Data":"3fba99db5664f00f5f6da4f0dddb04509c272d98833707da5fb3786f7ecec96f"} Apr 16 16:18:28.173303 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:28.173308 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" event={"ID":"e7bcf803-b1ee-4125-94ab-1012599c0fa0","Type":"ContainerStarted","Data":"f37ab06bf55d8a98387d48328e912b1ead46412b0c66df53d467e421b3fa54b4"} Apr 16 16:18:32.187117 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:32.187079 2581 generic.go:358] "Generic (PLEG): container finished" podID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerID="3fba99db5664f00f5f6da4f0dddb04509c272d98833707da5fb3786f7ecec96f" exitCode=0 Apr 16 16:18:32.187507 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:32.187152 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" event={"ID":"e7bcf803-b1ee-4125-94ab-1012599c0fa0","Type":"ContainerDied","Data":"3fba99db5664f00f5f6da4f0dddb04509c272d98833707da5fb3786f7ecec96f"} Apr 16 16:18:33.193033 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:33.192999 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" event={"ID":"e7bcf803-b1ee-4125-94ab-1012599c0fa0","Type":"ContainerStarted","Data":"13d2165253638530b78266b68f9857984c42120b6746be9419694e53789369f4"} Apr 16 16:18:33.193033 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:33.193042 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" event={"ID":"e7bcf803-b1ee-4125-94ab-1012599c0fa0","Type":"ContainerStarted","Data":"91ff818d79a1d05d2b4333dd629b59a8f400d1c5bd42cd762f7137a932107590"} Apr 16 16:18:33.193517 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:33.193372 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" Apr 16 16:18:33.193517 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:33.193403 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" Apr 16 16:18:33.194793 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:33.194764 2581 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 16:18:33.195399 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:33.195377 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:18:33.210655 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:33.210604 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podStartSLOduration=6.210586442 podStartE2EDuration="6.210586442s" podCreationTimestamp="2026-04-16 16:18:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:18:33.208561274 +0000 UTC m=+941.385622328" watchObservedRunningTime="2026-04-16 16:18:33.210586442 +0000 UTC m=+941.387647498" Apr 16 16:18:34.196276 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:34.196230 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 16:18:34.196665 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:34.196604 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:18:44.196645 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:44.196598 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 16:18:44.197157 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:44.196981 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:18:54.196523 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:54.196467 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 16:18:54.196954 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:18:54.196931 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:19:04.196742 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:04.196690 2581 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 16:19:04.197195 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:04.197162 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:19:14.196470 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:14.196367 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 16:19:14.196849 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:14.196768 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:19:24.197092 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:24.197042 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 16:19:24.197656 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:24.197491 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:19:34.196859 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:34.196810 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 16:19:34.197392 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:34.197367 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:19:44.197544 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:44.197513 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" Apr 16 16:19:44.198026 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:44.197675 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" Apr 16 16:19:52.525308 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:52.525274 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b"] Apr 16 16:19:52.525693 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:52.525591 2581 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="kserve-container" containerID="cri-o://91ff818d79a1d05d2b4333dd629b59a8f400d1c5bd42cd762f7137a932107590" gracePeriod=30 Apr 16 16:19:52.525742 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:52.525684 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="agent" containerID="cri-o://13d2165253638530b78266b68f9857984c42120b6746be9419694e53789369f4" gracePeriod=30 Apr 16 16:19:52.540659 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:52.540631 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g"] Apr 16 16:19:52.543978 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:52.543961 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" Apr 16 16:19:52.553152 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:52.553115 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g"] Apr 16 16:19:52.601685 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:52.601647 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc7aea67-0135-4660-ae60-28847adb7afe-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g\" (UID: \"bc7aea67-0135-4660-ae60-28847adb7afe\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" Apr 16 16:19:52.702296 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:52.702254 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc7aea67-0135-4660-ae60-28847adb7afe-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g\" (UID: \"bc7aea67-0135-4660-ae60-28847adb7afe\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" Apr 16 16:19:52.702660 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:52.702639 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc7aea67-0135-4660-ae60-28847adb7afe-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g\" (UID: \"bc7aea67-0135-4660-ae60-28847adb7afe\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" Apr 16 16:19:52.854933 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:52.854898 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" Apr 16 16:19:52.973261 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:52.973224 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g"] Apr 16 16:19:52.978622 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:19:52.978586 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc7aea67_0135_4660_ae60_28847adb7afe.slice/crio-188f2c911bbc4c31fbf2d0d227fc0abe21fc4513c68ccbd33c7fe14194bce9e1 WatchSource:0}: Error finding container 188f2c911bbc4c31fbf2d0d227fc0abe21fc4513c68ccbd33c7fe14194bce9e1: Status 404 returned error can't find the container with id 188f2c911bbc4c31fbf2d0d227fc0abe21fc4513c68ccbd33c7fe14194bce9e1 Apr 16 16:19:53.443327 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:53.443287 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" event={"ID":"bc7aea67-0135-4660-ae60-28847adb7afe","Type":"ContainerStarted","Data":"4619ff2170b841d6664d13aa14767e057d2e028f157290ee8f93c38d92082dbf"} Apr 16 16:19:53.443551 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:53.443357 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" event={"ID":"bc7aea67-0135-4660-ae60-28847adb7afe","Type":"ContainerStarted","Data":"188f2c911bbc4c31fbf2d0d227fc0abe21fc4513c68ccbd33c7fe14194bce9e1"} Apr 16 16:19:54.196935 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:54.196882 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 16:19:54.197368 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:54.197179 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:19:57.458844 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:57.458807 2581 generic.go:358] "Generic (PLEG): container finished" podID="bc7aea67-0135-4660-ae60-28847adb7afe" containerID="4619ff2170b841d6664d13aa14767e057d2e028f157290ee8f93c38d92082dbf" exitCode=0 Apr 16 16:19:57.459247 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:57.458878 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" event={"ID":"bc7aea67-0135-4660-ae60-28847adb7afe","Type":"ContainerDied","Data":"4619ff2170b841d6664d13aa14767e057d2e028f157290ee8f93c38d92082dbf"} Apr 16 16:19:58.462970 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:58.462937 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" event={"ID":"bc7aea67-0135-4660-ae60-28847adb7afe","Type":"ContainerStarted","Data":"2657dc3d142f3cb41267241df331c76192ee6d4328f75525c79eb0f6a30b7f33"} Apr 16 16:19:58.463393 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:58.463227 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" Apr 16 16:19:58.464527 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:58.464500 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 16:19:58.478310 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:58.478265 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" podStartSLOduration=6.478253421 podStartE2EDuration="6.478253421s" podCreationTimestamp="2026-04-16 16:19:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:19:58.476786086 +0000 UTC m=+1026.653847142" watchObservedRunningTime="2026-04-16 16:19:58.478253421 +0000 UTC m=+1026.655314476" Apr 16 16:19:59.467009 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:19:59.466969 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 16:20:01.474166 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:01.474130 2581 generic.go:358] "Generic (PLEG): container finished" podID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerID="91ff818d79a1d05d2b4333dd629b59a8f400d1c5bd42cd762f7137a932107590" exitCode=0 Apr 16 16:20:01.474557 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:01.474192 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" event={"ID":"e7bcf803-b1ee-4125-94ab-1012599c0fa0","Type":"ContainerDied","Data":"91ff818d79a1d05d2b4333dd629b59a8f400d1c5bd42cd762f7137a932107590"} Apr 16 16:20:04.197077 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:04.197025 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 16:20:04.197503 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:04.197387 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:20:09.467860 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:09.467810 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 16:20:14.196658 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:14.196604 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.27:8080: connect: connection refused" Apr 16 16:20:14.197067 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:14.196772 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" Apr 16 16:20:14.197067 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:14.196942 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:20:14.197067 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:14.197024 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" Apr 16 16:20:19.467451 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:19.467406 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 16:20:23.170562 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:23.170540 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" Apr 16 16:20:23.254701 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:23.254664 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7bcf803-b1ee-4125-94ab-1012599c0fa0-kserve-provision-location\") pod \"e7bcf803-b1ee-4125-94ab-1012599c0fa0\" (UID: \"e7bcf803-b1ee-4125-94ab-1012599c0fa0\") " Apr 16 16:20:23.255018 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:23.254991 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7bcf803-b1ee-4125-94ab-1012599c0fa0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e7bcf803-b1ee-4125-94ab-1012599c0fa0" (UID: "e7bcf803-b1ee-4125-94ab-1012599c0fa0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:20:23.356158 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:23.356069 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7bcf803-b1ee-4125-94ab-1012599c0fa0-kserve-provision-location\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 16 16:20:23.546620 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:23.546583 2581 generic.go:358] "Generic (PLEG): container finished" podID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerID="13d2165253638530b78266b68f9857984c42120b6746be9419694e53789369f4" exitCode=0 Apr 16 16:20:23.546792 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:23.546664 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" Apr 16 16:20:23.546792 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:23.546662 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" event={"ID":"e7bcf803-b1ee-4125-94ab-1012599c0fa0","Type":"ContainerDied","Data":"13d2165253638530b78266b68f9857984c42120b6746be9419694e53789369f4"} Apr 16 16:20:23.546792 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:23.546772 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b" event={"ID":"e7bcf803-b1ee-4125-94ab-1012599c0fa0","Type":"ContainerDied","Data":"f37ab06bf55d8a98387d48328e912b1ead46412b0c66df53d467e421b3fa54b4"} Apr 16 16:20:23.546792 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:23.546789 2581 scope.go:117] "RemoveContainer" containerID="13d2165253638530b78266b68f9857984c42120b6746be9419694e53789369f4" Apr 16 16:20:23.555169 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:23.554972 2581 scope.go:117] "RemoveContainer" containerID="91ff818d79a1d05d2b4333dd629b59a8f400d1c5bd42cd762f7137a932107590" Apr 16 16:20:23.561782 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:23.561765 2581 scope.go:117] "RemoveContainer" containerID="3fba99db5664f00f5f6da4f0dddb04509c272d98833707da5fb3786f7ecec96f" Apr 16 16:20:23.568701 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:23.568620 2581 scope.go:117] "RemoveContainer" containerID="13d2165253638530b78266b68f9857984c42120b6746be9419694e53789369f4" Apr 16 16:20:23.569008 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:20:23.568966 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13d2165253638530b78266b68f9857984c42120b6746be9419694e53789369f4\": container with ID starting with 13d2165253638530b78266b68f9857984c42120b6746be9419694e53789369f4 not found: ID does not exist" containerID="13d2165253638530b78266b68f9857984c42120b6746be9419694e53789369f4" Apr 16 16:20:23.569115 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:23.569014 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d2165253638530b78266b68f9857984c42120b6746be9419694e53789369f4"} err="failed to get container status \"13d2165253638530b78266b68f9857984c42120b6746be9419694e53789369f4\": rpc error: code = NotFound desc = could not find container \"13d2165253638530b78266b68f9857984c42120b6746be9419694e53789369f4\": container with ID starting with 13d2165253638530b78266b68f9857984c42120b6746be9419694e53789369f4 not found: ID does not exist" Apr 16 16:20:23.569115 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:23.569035 2581 scope.go:117] "RemoveContainer" containerID="91ff818d79a1d05d2b4333dd629b59a8f400d1c5bd42cd762f7137a932107590" Apr 16 16:20:23.569310 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:20:23.569293 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91ff818d79a1d05d2b4333dd629b59a8f400d1c5bd42cd762f7137a932107590\": container with ID starting with 91ff818d79a1d05d2b4333dd629b59a8f400d1c5bd42cd762f7137a932107590 not found: ID does not exist" containerID="91ff818d79a1d05d2b4333dd629b59a8f400d1c5bd42cd762f7137a932107590" Apr 16 16:20:23.569399 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:23.569314 2581 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"91ff818d79a1d05d2b4333dd629b59a8f400d1c5bd42cd762f7137a932107590"} err="failed to get container status \"91ff818d79a1d05d2b4333dd629b59a8f400d1c5bd42cd762f7137a932107590\": rpc error: code = NotFound desc = could not find container \"91ff818d79a1d05d2b4333dd629b59a8f400d1c5bd42cd762f7137a932107590\": container with ID starting with 91ff818d79a1d05d2b4333dd629b59a8f400d1c5bd42cd762f7137a932107590 not found: ID does not exist" Apr 16 16:20:23.569399 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:23.569362 2581 scope.go:117] "RemoveContainer" containerID="3fba99db5664f00f5f6da4f0dddb04509c272d98833707da5fb3786f7ecec96f" Apr 16 16:20:23.569629 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:20:23.569613 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fba99db5664f00f5f6da4f0dddb04509c272d98833707da5fb3786f7ecec96f\": container with ID starting with 3fba99db5664f00f5f6da4f0dddb04509c272d98833707da5fb3786f7ecec96f not found: ID does not exist" containerID="3fba99db5664f00f5f6da4f0dddb04509c272d98833707da5fb3786f7ecec96f" Apr 16 16:20:23.569681 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:23.569633 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fba99db5664f00f5f6da4f0dddb04509c272d98833707da5fb3786f7ecec96f"} err="failed to get container status \"3fba99db5664f00f5f6da4f0dddb04509c272d98833707da5fb3786f7ecec96f\": rpc error: code = NotFound desc = could not find container \"3fba99db5664f00f5f6da4f0dddb04509c272d98833707da5fb3786f7ecec96f\": container with ID starting with 3fba99db5664f00f5f6da4f0dddb04509c272d98833707da5fb3786f7ecec96f not found: ID does not exist" Apr 16 16:20:23.569841 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:23.569823 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b"] Apr 16 16:20:23.572859 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:23.572832 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-74f70-predictor-86f49fcb48-hxk5b"] Apr 16 16:20:24.378615 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:24.378579 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" path="/var/lib/kubelet/pods/e7bcf803-b1ee-4125-94ab-1012599c0fa0/volumes" Apr 16 16:20:29.467299 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:29.467252 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 16:20:39.467580 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:39.467529 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 16:20:49.467767 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:49.467655 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.28:8080: connect: connection refused" Apr 16 16:20:59.467036 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:20:59.466991 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 16:21:09.467982 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:21:09.467936 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 16:21:16.371933 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:21:16.371883 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 16:21:26.371946 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:21:26.371900 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 16:21:36.372265 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:21:36.372215 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 16:21:46.372115 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:21:46.372062 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 16:21:56.372088 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:21:56.372043 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 16:22:06.372704 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:06.372654 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 16:22:16.377116 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:16.377048 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" Apr 16 16:22:22.704625 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:22.704592 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g"] Apr 16 16:22:22.705017 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:22.704879 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="kserve-container" containerID="cri-o://2657dc3d142f3cb41267241df331c76192ee6d4328f75525c79eb0f6a30b7f33" gracePeriod=30 Apr 16 16:22:22.791912 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:22.791881 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm"] Apr 16 16:22:22.792210 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:22.792197 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="storage-initializer" Apr 16 16:22:22.792257 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:22.792213 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="storage-initializer" Apr 16 16:22:22.792257 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:22.792224 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="kserve-container" Apr 16 16:22:22.792257 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:22.792229 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="kserve-container" Apr 16 16:22:22.792365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:22.792267 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="agent" Apr 16 16:22:22.792365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:22.792273 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="agent" Apr 16 16:22:22.792365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:22.792362 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="kserve-container" Apr 16 16:22:22.792456 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:22.792375 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7bcf803-b1ee-4125-94ab-1012599c0fa0" containerName="agent" Apr 16 16:22:22.795265 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:22.795246 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" Apr 16 16:22:22.804949 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:22.804918 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm"] Apr 16 16:22:22.864062 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:22.864025 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ee006b4-beab-4d69-94e9-eccc23549281-kserve-provision-location\") pod \"isvc-primary-89be61-predictor-789bf94c5d-99btm\" (UID: \"4ee006b4-beab-4d69-94e9-eccc23549281\") " pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" Apr 16 16:22:22.964871 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:22.964776 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ee006b4-beab-4d69-94e9-eccc23549281-kserve-provision-location\") pod \"isvc-primary-89be61-predictor-789bf94c5d-99btm\" (UID: \"4ee006b4-beab-4d69-94e9-eccc23549281\") " pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" Apr 16 16:22:22.965057 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:22.965036 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ee006b4-beab-4d69-94e9-eccc23549281-kserve-provision-location\") pod \"isvc-primary-89be61-predictor-789bf94c5d-99btm\" (UID: \"4ee006b4-beab-4d69-94e9-eccc23549281\") " pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" Apr 16 16:22:23.110245 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:23.110213 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" Apr 16 16:22:23.230236 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:23.230209 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm"] Apr 16 16:22:23.232715 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:22:23.232683 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ee006b4_beab_4d69_94e9_eccc23549281.slice/crio-1b5af0e00bf7253182c707bb14f82112627de4f9ba6ac960b406bdc3aa8dd7b9 WatchSource:0}: Error finding container 1b5af0e00bf7253182c707bb14f82112627de4f9ba6ac960b406bdc3aa8dd7b9: Status 404 returned error can't find the container with id 1b5af0e00bf7253182c707bb14f82112627de4f9ba6ac960b406bdc3aa8dd7b9 Apr 16 16:22:23.234523 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:23.234503 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:22:23.916821 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:23.916786 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" event={"ID":"4ee006b4-beab-4d69-94e9-eccc23549281","Type":"ContainerStarted","Data":"49c80f8c948c6f50a5dd514f194615a913cf30aad2d4f6d604eaa7799dcb8a2a"} Apr 16 16:22:23.916821 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:23.916824 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" event={"ID":"4ee006b4-beab-4d69-94e9-eccc23549281","Type":"ContainerStarted","Data":"1b5af0e00bf7253182c707bb14f82112627de4f9ba6ac960b406bdc3aa8dd7b9"} Apr 16 16:22:26.372778 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:26.372736 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 16:22:27.930748 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:27.930715 2581 generic.go:358] "Generic (PLEG): container finished" podID="4ee006b4-beab-4d69-94e9-eccc23549281" containerID="49c80f8c948c6f50a5dd514f194615a913cf30aad2d4f6d604eaa7799dcb8a2a" exitCode=0 Apr 16 16:22:27.931152 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:27.930780 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" event={"ID":"4ee006b4-beab-4d69-94e9-eccc23549281","Type":"ContainerDied","Data":"49c80f8c948c6f50a5dd514f194615a913cf30aad2d4f6d604eaa7799dcb8a2a"} Apr 16 16:22:28.935055 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:28.935022 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" event={"ID":"4ee006b4-beab-4d69-94e9-eccc23549281","Type":"ContainerStarted","Data":"0fb92bc9ce0efb40355de1317ed6675b1e2c87c8773a89fab7a2c1b6ae99d845"} Apr 16 16:22:28.935480 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:28.935324 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" Apr 16 16:22:28.936770 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:28.936744 2581 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" podUID="4ee006b4-beab-4d69-94e9-eccc23549281" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 16:22:28.950609 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:28.950567 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" podStartSLOduration=6.950553669 podStartE2EDuration="6.950553669s" podCreationTimestamp="2026-04-16 16:22:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:22:28.949373152 +0000 UTC m=+1177.126434207" watchObservedRunningTime="2026-04-16 16:22:28.950553669 +0000 UTC m=+1177.127614785" Apr 16 16:22:29.939312 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:29.939270 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" podUID="4ee006b4-beab-4d69-94e9-eccc23549281" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 16:22:32.257453 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:32.257431 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" Apr 16 16:22:32.333767 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:32.333677 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc7aea67-0135-4660-ae60-28847adb7afe-kserve-provision-location\") pod \"bc7aea67-0135-4660-ae60-28847adb7afe\" (UID: \"bc7aea67-0135-4660-ae60-28847adb7afe\") " Apr 16 16:22:32.334022 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:32.334000 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc7aea67-0135-4660-ae60-28847adb7afe-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bc7aea67-0135-4660-ae60-28847adb7afe" (UID: "bc7aea67-0135-4660-ae60-28847adb7afe"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:22:32.434546 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:32.434510 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc7aea67-0135-4660-ae60-28847adb7afe-kserve-provision-location\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 16 16:22:32.949483 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:32.949449 2581 generic.go:358] "Generic (PLEG): container finished" podID="bc7aea67-0135-4660-ae60-28847adb7afe" containerID="2657dc3d142f3cb41267241df331c76192ee6d4328f75525c79eb0f6a30b7f33" exitCode=0 Apr 16 16:22:32.949656 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:32.949541 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" Apr 16 16:22:32.949656 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:32.949539 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" event={"ID":"bc7aea67-0135-4660-ae60-28847adb7afe","Type":"ContainerDied","Data":"2657dc3d142f3cb41267241df331c76192ee6d4328f75525c79eb0f6a30b7f33"} Apr 16 16:22:32.949656 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:32.949579 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g" event={"ID":"bc7aea67-0135-4660-ae60-28847adb7afe","Type":"ContainerDied","Data":"188f2c911bbc4c31fbf2d0d227fc0abe21fc4513c68ccbd33c7fe14194bce9e1"} Apr 16 16:22:32.949656 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:32.949594 2581 scope.go:117] "RemoveContainer" containerID="2657dc3d142f3cb41267241df331c76192ee6d4328f75525c79eb0f6a30b7f33" Apr 16 16:22:32.957320 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:32.957303 2581 scope.go:117] "RemoveContainer" containerID="4619ff2170b841d6664d13aa14767e057d2e028f157290ee8f93c38d92082dbf" Apr 16 16:22:32.964608 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:32.964590 2581 scope.go:117] "RemoveContainer" containerID="2657dc3d142f3cb41267241df331c76192ee6d4328f75525c79eb0f6a30b7f33" Apr 16 16:22:32.964882 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:22:32.964849 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2657dc3d142f3cb41267241df331c76192ee6d4328f75525c79eb0f6a30b7f33\": container with ID starting with 2657dc3d142f3cb41267241df331c76192ee6d4328f75525c79eb0f6a30b7f33 not found: ID does not exist" containerID="2657dc3d142f3cb41267241df331c76192ee6d4328f75525c79eb0f6a30b7f33" Apr 16 16:22:32.964946 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:32.964892 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2657dc3d142f3cb41267241df331c76192ee6d4328f75525c79eb0f6a30b7f33"} err="failed to get container status \"2657dc3d142f3cb41267241df331c76192ee6d4328f75525c79eb0f6a30b7f33\": rpc error: code = NotFound desc = could not find container \"2657dc3d142f3cb41267241df331c76192ee6d4328f75525c79eb0f6a30b7f33\": container with ID starting with 2657dc3d142f3cb41267241df331c76192ee6d4328f75525c79eb0f6a30b7f33 not found: ID does not exist" Apr 16 16:22:32.964946 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:32.964914 2581 scope.go:117] "RemoveContainer" containerID="4619ff2170b841d6664d13aa14767e057d2e028f157290ee8f93c38d92082dbf" Apr 16 16:22:32.965182 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:22:32.965162 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4619ff2170b841d6664d13aa14767e057d2e028f157290ee8f93c38d92082dbf\": container with ID starting with 4619ff2170b841d6664d13aa14767e057d2e028f157290ee8f93c38d92082dbf not found: ID does not exist" containerID="4619ff2170b841d6664d13aa14767e057d2e028f157290ee8f93c38d92082dbf" Apr 16 16:22:32.965235 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:32.965192 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4619ff2170b841d6664d13aa14767e057d2e028f157290ee8f93c38d92082dbf"} err="failed to get container status 
\"4619ff2170b841d6664d13aa14767e057d2e028f157290ee8f93c38d92082dbf\": rpc error: code = NotFound desc = could not find container \"4619ff2170b841d6664d13aa14767e057d2e028f157290ee8f93c38d92082dbf\": container with ID starting with 4619ff2170b841d6664d13aa14767e057d2e028f157290ee8f93c38d92082dbf not found: ID does not exist" Apr 16 16:22:32.965367 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:32.965350 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g"] Apr 16 16:22:32.967959 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:32.967937 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-954a2-predictor-877cc765-jf86g"] Apr 16 16:22:34.376352 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:34.376309 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" path="/var/lib/kubelet/pods/bc7aea67-0135-4660-ae60-28847adb7afe/volumes" Apr 16 16:22:39.939555 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:39.939504 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" podUID="4ee006b4-beab-4d69-94e9-eccc23549281" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 16:22:49.939760 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:49.939707 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" podUID="4ee006b4-beab-4d69-94e9-eccc23549281" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 16:22:59.939964 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:22:59.939920 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" podUID="4ee006b4-beab-4d69-94e9-eccc23549281" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 16:23:09.939510 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:09.939459 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" podUID="4ee006b4-beab-4d69-94e9-eccc23549281" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 16:23:19.939808 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:19.939758 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" podUID="4ee006b4-beab-4d69-94e9-eccc23549281" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 16:23:29.940054 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:29.940011 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" podUID="4ee006b4-beab-4d69-94e9-eccc23549281" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 16:23:39.941392 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:39.941357 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" Apr 16 16:23:42.933478 ip-10-0-135-144 
Apr 16 16:23:42.933478 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:42.933395 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp"]
Apr 16 16:23:42.933832 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:42.933740 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="storage-initializer"
Apr 16 16:23:42.933832 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:42.933754 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="storage-initializer"
Apr 16 16:23:42.933832 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:42.933771 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="kserve-container"
Apr 16 16:23:42.933832 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:42.933777 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="kserve-container"
Apr 16 16:23:42.933832 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:42.933832 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc7aea67-0135-4660-ae60-28847adb7afe" containerName="kserve-container"
Apr 16 16:23:42.936877 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:42.936851 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp"
Apr 16 16:23:42.939161 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:42.939140 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-89be61-dockercfg-zvxrb\""
Apr 16 16:23:42.939299 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:42.939165 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-89be61\""
Apr 16 16:23:42.939299 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:42.939141 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 16 16:23:42.943147 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:42.943118 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp"]
Apr 16 16:23:42.978647 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:42.978608 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4e44d3b1-7732-4fc5-81d9-17269abf124a-cabundle-cert\") pod \"isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp\" (UID: \"4e44d3b1-7732-4fc5-81d9-17269abf124a\") " pod="kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp"
Apr 16 16:23:42.978835 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:42.978664 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e44d3b1-7732-4fc5-81d9-17269abf124a-kserve-provision-location\") pod \"isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp\" (UID: \"4e44d3b1-7732-4fc5-81d9-17269abf124a\") " pod="kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp"
Apr 16 16:23:43.079160 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:43.079117 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4e44d3b1-7732-4fc5-81d9-17269abf124a-cabundle-cert\") pod \"isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp\" (UID: \"4e44d3b1-7732-4fc5-81d9-17269abf124a\") " pod="kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp"
Apr 16 16:23:43.079365 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:43.079182 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e44d3b1-7732-4fc5-81d9-17269abf124a-kserve-provision-location\") pod \"isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp\" (UID: \"4e44d3b1-7732-4fc5-81d9-17269abf124a\") " pod="kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp"
Apr 16 16:23:43.079677 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:43.079655 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e44d3b1-7732-4fc5-81d9-17269abf124a-kserve-provision-location\") pod \"isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp\" (UID: \"4e44d3b1-7732-4fc5-81d9-17269abf124a\") " pod="kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp"
Apr 16 16:23:43.079873 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:43.079852 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4e44d3b1-7732-4fc5-81d9-17269abf124a-cabundle-cert\") pod \"isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp\" (UID: \"4e44d3b1-7732-4fc5-81d9-17269abf124a\") " pod="kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp"
Apr 16 16:23:43.248462 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:43.248365 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp"
Apr 16 16:23:43.368717 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:43.368540 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp"]
Apr 16 16:23:43.371712 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:23:43.371685 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e44d3b1_7732_4fc5_81d9_17269abf124a.slice/crio-6f1ff7f44ec3a77773da6f43d25d071dfa071cb67c10938bd812678ce309d3b2 WatchSource:0}: Error finding container 6f1ff7f44ec3a77773da6f43d25d071dfa071cb67c10938bd812678ce309d3b2: Status 404 returned error can't find the container with id 6f1ff7f44ec3a77773da6f43d25d071dfa071cb67c10938bd812678ce309d3b2
Apr 16 16:23:44.177000 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:44.176964 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp" event={"ID":"4e44d3b1-7732-4fc5-81d9-17269abf124a","Type":"ContainerStarted","Data":"02a0c7e6931f3a1f7fdd5430510a783cabea5d3992705a812b82b5c33757745a"}
Apr 16 16:23:44.177000 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:44.177004 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp" event={"ID":"4e44d3b1-7732-4fc5-81d9-17269abf124a","Type":"ContainerStarted","Data":"6f1ff7f44ec3a77773da6f43d25d071dfa071cb67c10938bd812678ce309d3b2"}
Apr 16 16:23:49.194311 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:49.194282 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp_4e44d3b1-7732-4fc5-81d9-17269abf124a/storage-initializer/0.log"
Apr 16 16:23:49.194691 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:49.194321 2581 generic.go:358] "Generic (PLEG): container finished" podID="4e44d3b1-7732-4fc5-81d9-17269abf124a" containerID="02a0c7e6931f3a1f7fdd5430510a783cabea5d3992705a812b82b5c33757745a" exitCode=1
Apr 16 16:23:49.194691 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:49.194403 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp" event={"ID":"4e44d3b1-7732-4fc5-81d9-17269abf124a","Type":"ContainerDied","Data":"02a0c7e6931f3a1f7fdd5430510a783cabea5d3992705a812b82b5c33757745a"}
Apr 16 16:23:50.199174 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:50.199144 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp_4e44d3b1-7732-4fc5-81d9-17269abf124a/storage-initializer/0.log"
Apr 16 16:23:50.199604 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:50.199242 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp" event={"ID":"4e44d3b1-7732-4fc5-81d9-17269abf124a","Type":"ContainerStarted","Data":"f6581801dc7522e16cb7b13e572818db870f629f077265e65a9e135e3a416ec2"}
Apr 16 16:23:56.218751 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:56.218724 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp_4e44d3b1-7732-4fc5-81d9-17269abf124a/storage-initializer/1.log"
Apr 16 16:23:56.219153 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:56.219086 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp_4e44d3b1-7732-4fc5-81d9-17269abf124a/storage-initializer/0.log"
path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp_4e44d3b1-7732-4fc5-81d9-17269abf124a/storage-initializer/0.log" Apr 16 16:23:56.219153 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:56.219119 2581 generic.go:358] "Generic (PLEG): container finished" podID="4e44d3b1-7732-4fc5-81d9-17269abf124a" containerID="f6581801dc7522e16cb7b13e572818db870f629f077265e65a9e135e3a416ec2" exitCode=1 Apr 16 16:23:56.219234 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:56.219167 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp" event={"ID":"4e44d3b1-7732-4fc5-81d9-17269abf124a","Type":"ContainerDied","Data":"f6581801dc7522e16cb7b13e572818db870f629f077265e65a9e135e3a416ec2"} Apr 16 16:23:56.219234 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:56.219219 2581 scope.go:117] "RemoveContainer" containerID="02a0c7e6931f3a1f7fdd5430510a783cabea5d3992705a812b82b5c33757745a" Apr 16 16:23:56.219648 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:56.219631 2581 scope.go:117] "RemoveContainer" containerID="02a0c7e6931f3a1f7fdd5430510a783cabea5d3992705a812b82b5c33757745a" Apr 16 16:23:56.229321 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:23:56.229287 2581 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp_kserve-ci-e2e-test_4e44d3b1-7732-4fc5-81d9-17269abf124a_0 in pod sandbox 6f1ff7f44ec3a77773da6f43d25d071dfa071cb67c10938bd812678ce309d3b2 from index: no such id: '02a0c7e6931f3a1f7fdd5430510a783cabea5d3992705a812b82b5c33757745a'" containerID="02a0c7e6931f3a1f7fdd5430510a783cabea5d3992705a812b82b5c33757745a" Apr 16 16:23:56.229413 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:23:56.229363 2581 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp_kserve-ci-e2e-test_4e44d3b1-7732-4fc5-81d9-17269abf124a_0 in pod sandbox 6f1ff7f44ec3a77773da6f43d25d071dfa071cb67c10938bd812678ce309d3b2 from index: no such id: '02a0c7e6931f3a1f7fdd5430510a783cabea5d3992705a812b82b5c33757745a'; Skipping pod \"isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp_kserve-ci-e2e-test(4e44d3b1-7732-4fc5-81d9-17269abf124a)\"" logger="UnhandledError" Apr 16 16:23:56.230695 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:23:56.230676 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp_kserve-ci-e2e-test(4e44d3b1-7732-4fc5-81d9-17269abf124a)\"" pod="kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp" podUID="4e44d3b1-7732-4fc5-81d9-17269abf124a" Apr 16 16:23:57.223609 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:23:57.223581 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp_4e44d3b1-7732-4fc5-81d9-17269abf124a/storage-initializer/1.log" Apr 16 16:24:01.006698 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.006658 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp"] Apr 16 16:24:01.060014 ip-10-0-135-144 kubenswrapper[2581]: 
I0416 16:24:01.059982 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm"] Apr 16 16:24:01.060321 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.060268 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" podUID="4ee006b4-beab-4d69-94e9-eccc23549281" containerName="kserve-container" containerID="cri-o://0fb92bc9ce0efb40355de1317ed6675b1e2c87c8773a89fab7a2c1b6ae99d845" gracePeriod=30 Apr 16 16:24:01.106850 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.106816 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv"] Apr 16 16:24:01.111322 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.111296 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv" Apr 16 16:24:01.113567 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.113540 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-b43ebd\"" Apr 16 16:24:01.113714 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.113646 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-b43ebd-dockercfg-v2jfj\"" Apr 16 16:24:01.119757 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.119733 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv"] Apr 16 16:24:01.154479 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.154456 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp_4e44d3b1-7732-4fc5-81d9-17269abf124a/storage-initializer/1.log" Apr 16 16:24:01.154602 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.154519 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp" Apr 16 16:24:01.220237 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.220201 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4e44d3b1-7732-4fc5-81d9-17269abf124a-cabundle-cert\") pod \"4e44d3b1-7732-4fc5-81d9-17269abf124a\" (UID: \"4e44d3b1-7732-4fc5-81d9-17269abf124a\") " Apr 16 16:24:01.220462 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.220277 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e44d3b1-7732-4fc5-81d9-17269abf124a-kserve-provision-location\") pod \"4e44d3b1-7732-4fc5-81d9-17269abf124a\" (UID: \"4e44d3b1-7732-4fc5-81d9-17269abf124a\") " Apr 16 16:24:01.220462 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.220399 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e540668-c921-499a-b737-e8b918f2be29-kserve-provision-location\") pod \"isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv\" (UID: \"8e540668-c921-499a-b737-e8b918f2be29\") " pod="kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv" Apr 16 16:24:01.220462 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.220434 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/8e540668-c921-499a-b737-e8b918f2be29-cabundle-cert\") pod \"isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv\" (UID: \"8e540668-c921-499a-b737-e8b918f2be29\") " pod="kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv" Apr 16 16:24:01.220652 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.220567 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e44d3b1-7732-4fc5-81d9-17269abf124a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4e44d3b1-7732-4fc5-81d9-17269abf124a" (UID: "4e44d3b1-7732-4fc5-81d9-17269abf124a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:24:01.220709 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.220688 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e44d3b1-7732-4fc5-81d9-17269abf124a-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "4e44d3b1-7732-4fc5-81d9-17269abf124a" (UID: "4e44d3b1-7732-4fc5-81d9-17269abf124a"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:24:01.236103 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.236077 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp_4e44d3b1-7732-4fc5-81d9-17269abf124a/storage-initializer/1.log" Apr 16 16:24:01.236260 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.236131 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp" event={"ID":"4e44d3b1-7732-4fc5-81d9-17269abf124a","Type":"ContainerDied","Data":"6f1ff7f44ec3a77773da6f43d25d071dfa071cb67c10938bd812678ce309d3b2"} Apr 16 16:24:01.236260 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.236169 2581 scope.go:117] "RemoveContainer" containerID="f6581801dc7522e16cb7b13e572818db870f629f077265e65a9e135e3a416ec2" Apr 16 16:24:01.236260 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.236203 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp" Apr 16 16:24:01.267473 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.267401 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp"] Apr 16 16:24:01.269722 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.269695 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-89be61-predictor-5bdc697fb7-9ndcp"] Apr 16 16:24:01.321465 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.321426 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e540668-c921-499a-b737-e8b918f2be29-kserve-provision-location\") pod \"isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv\" (UID: \"8e540668-c921-499a-b737-e8b918f2be29\") " pod="kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv" Apr 16 16:24:01.321642 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.321477 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/8e540668-c921-499a-b737-e8b918f2be29-cabundle-cert\") pod \"isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv\" (UID: \"8e540668-c921-499a-b737-e8b918f2be29\") " pod="kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv" Apr 16 16:24:01.321642 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.321509 2581 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4e44d3b1-7732-4fc5-81d9-17269abf124a-cabundle-cert\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 16 16:24:01.321642 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.321520 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e44d3b1-7732-4fc5-81d9-17269abf124a-kserve-provision-location\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 16 16:24:01.321824 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.321806 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e540668-c921-499a-b737-e8b918f2be29-kserve-provision-location\") pod \"isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv\" (UID: \"8e540668-c921-499a-b737-e8b918f2be29\") " 
pod="kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv" Apr 16 16:24:01.322064 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.322047 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/8e540668-c921-499a-b737-e8b918f2be29-cabundle-cert\") pod \"isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv\" (UID: \"8e540668-c921-499a-b737-e8b918f2be29\") " pod="kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv" Apr 16 16:24:01.422364 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.422308 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv" Apr 16 16:24:01.563401 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:01.563367 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv"] Apr 16 16:24:01.566545 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:24:01.566509 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e540668_c921_499a_b737_e8b918f2be29.slice/crio-ba8ee690b0d2885d2b7e71556e6aa243b93ef729c8424dac93eac95b5bbc4cf9 WatchSource:0}: Error finding container ba8ee690b0d2885d2b7e71556e6aa243b93ef729c8424dac93eac95b5bbc4cf9: Status 404 returned error can't find the container with id ba8ee690b0d2885d2b7e71556e6aa243b93ef729c8424dac93eac95b5bbc4cf9 Apr 16 16:24:02.240855 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:02.240817 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv" event={"ID":"8e540668-c921-499a-b737-e8b918f2be29","Type":"ContainerStarted","Data":"b15dd32c508c57fc92e8f861840ce7e26943f81d003b70341ae6179de103501d"} Apr 16 16:24:02.240855 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:02.240858 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv" event={"ID":"8e540668-c921-499a-b737-e8b918f2be29","Type":"ContainerStarted","Data":"ba8ee690b0d2885d2b7e71556e6aa243b93ef729c8424dac93eac95b5bbc4cf9"} Apr 16 16:24:02.380581 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:02.380538 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e44d3b1-7732-4fc5-81d9-17269abf124a" path="/var/lib/kubelet/pods/4e44d3b1-7732-4fc5-81d9-17269abf124a/volumes" Apr 16 16:24:05.603648 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:05.603618 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" Apr 16 16:24:05.652927 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:05.652830 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ee006b4-beab-4d69-94e9-eccc23549281-kserve-provision-location\") pod \"4ee006b4-beab-4d69-94e9-eccc23549281\" (UID: \"4ee006b4-beab-4d69-94e9-eccc23549281\") " Apr 16 16:24:05.653232 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:05.653202 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ee006b4-beab-4d69-94e9-eccc23549281-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4ee006b4-beab-4d69-94e9-eccc23549281" (UID: "4ee006b4-beab-4d69-94e9-eccc23549281"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:24:05.753653 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:05.753611 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ee006b4-beab-4d69-94e9-eccc23549281-kserve-provision-location\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 16 16:24:06.256208 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:06.256169 2581 generic.go:358] "Generic (PLEG): container finished" podID="4ee006b4-beab-4d69-94e9-eccc23549281" containerID="0fb92bc9ce0efb40355de1317ed6675b1e2c87c8773a89fab7a2c1b6ae99d845" exitCode=0 Apr 16 16:24:06.256436 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:06.256256 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" Apr 16 16:24:06.256436 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:06.256255 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" event={"ID":"4ee006b4-beab-4d69-94e9-eccc23549281","Type":"ContainerDied","Data":"0fb92bc9ce0efb40355de1317ed6675b1e2c87c8773a89fab7a2c1b6ae99d845"} Apr 16 16:24:06.256436 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:06.256298 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm" event={"ID":"4ee006b4-beab-4d69-94e9-eccc23549281","Type":"ContainerDied","Data":"1b5af0e00bf7253182c707bb14f82112627de4f9ba6ac960b406bdc3aa8dd7b9"} Apr 16 16:24:06.256436 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:06.256316 2581 scope.go:117] "RemoveContainer" containerID="0fb92bc9ce0efb40355de1317ed6675b1e2c87c8773a89fab7a2c1b6ae99d845" Apr 16 16:24:06.265088 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:06.265071 2581 scope.go:117] "RemoveContainer" containerID="49c80f8c948c6f50a5dd514f194615a913cf30aad2d4f6d604eaa7799dcb8a2a" Apr 16 16:24:06.271819 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:06.271794 2581 scope.go:117] "RemoveContainer" containerID="0fb92bc9ce0efb40355de1317ed6675b1e2c87c8773a89fab7a2c1b6ae99d845" Apr 16 16:24:06.272062 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:24:06.272044 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fb92bc9ce0efb40355de1317ed6675b1e2c87c8773a89fab7a2c1b6ae99d845\": container with ID starting with 0fb92bc9ce0efb40355de1317ed6675b1e2c87c8773a89fab7a2c1b6ae99d845 not found: ID does not exist" containerID="0fb92bc9ce0efb40355de1317ed6675b1e2c87c8773a89fab7a2c1b6ae99d845" Apr 16 16:24:06.272107 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:06.272072 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fb92bc9ce0efb40355de1317ed6675b1e2c87c8773a89fab7a2c1b6ae99d845"} err="failed to get container status \"0fb92bc9ce0efb40355de1317ed6675b1e2c87c8773a89fab7a2c1b6ae99d845\": rpc error: code = NotFound desc = could not find container \"0fb92bc9ce0efb40355de1317ed6675b1e2c87c8773a89fab7a2c1b6ae99d845\": container with ID starting with 0fb92bc9ce0efb40355de1317ed6675b1e2c87c8773a89fab7a2c1b6ae99d845 not found: ID does not exist" Apr 16 16:24:06.272107 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:06.272089 2581 scope.go:117] "RemoveContainer" containerID="49c80f8c948c6f50a5dd514f194615a913cf30aad2d4f6d604eaa7799dcb8a2a" 
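The RemoveContainer / "ContainerStatus from runtime service failed" (NotFound) / "DeleteContainer returned error" triple repeats for every container torn down in this journal: the kubelet retries deletion of a container CRI-O has already removed, so these NotFound errors are expected cleanup noise rather than failures. When scanning a dump like this for runtime errors that actually matter (for example the code = Unknown delete failure and the CrashLoopBackOff earlier in this section), it can help to filter that benign pattern out first. A minimal sketch follows; the kubelet.log fallback filename is an assumption, and the regexes are keyed to the klog E-level prefix and CRI error text shown in these lines.

import re
import sys

# The benign double-delete race: CRI reports NotFound for an already-removed container.
BENIGN = re.compile(r"code = NotFound desc = could not find container")
# klog error-level records look like "E0416 16:24:06.272284".
ERROR_RE = re.compile(r"\bE\d{4} \d{2}:\d{2}:\d{2}")

path = sys.argv[1] if len(sys.argv) > 1 else "kubelet.log"  # assumed dump filename
with open(path) as fh:
    for line in fh:
        if ERROR_RE.search(line) and not BENIGN.search(line):
            sys.stdout.write(line)

Applied to this section, the filter would surface the "RemoveContainer from runtime service failed ... no such id" and "Error syncing pod ... CrashLoopBackOff" lines while suppressing the routine NotFound churn.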
Apr 16 16:24:06.272302 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:24:06.272284 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49c80f8c948c6f50a5dd514f194615a913cf30aad2d4f6d604eaa7799dcb8a2a\": container with ID starting with 49c80f8c948c6f50a5dd514f194615a913cf30aad2d4f6d604eaa7799dcb8a2a not found: ID does not exist" containerID="49c80f8c948c6f50a5dd514f194615a913cf30aad2d4f6d604eaa7799dcb8a2a"
Apr 16 16:24:06.272374 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:06.272308 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49c80f8c948c6f50a5dd514f194615a913cf30aad2d4f6d604eaa7799dcb8a2a"} err="failed to get container status \"49c80f8c948c6f50a5dd514f194615a913cf30aad2d4f6d604eaa7799dcb8a2a\": rpc error: code = NotFound desc = could not find container \"49c80f8c948c6f50a5dd514f194615a913cf30aad2d4f6d604eaa7799dcb8a2a\": container with ID starting with 49c80f8c948c6f50a5dd514f194615a913cf30aad2d4f6d604eaa7799dcb8a2a not found: ID does not exist"
Apr 16 16:24:06.276493 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:06.276473 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm"]
Apr 16 16:24:06.279842 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:06.279821 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-89be61-predictor-789bf94c5d-99btm"]
Apr 16 16:24:06.375812 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:06.375778 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ee006b4-beab-4d69-94e9-eccc23549281" path="/var/lib/kubelet/pods/4ee006b4-beab-4d69-94e9-eccc23549281/volumes"
Apr 16 16:24:08.265638 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:08.265611 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv_8e540668-c921-499a-b737-e8b918f2be29/storage-initializer/0.log"
Apr 16 16:24:08.266013 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:08.265649 2581 generic.go:358] "Generic (PLEG): container finished" podID="8e540668-c921-499a-b737-e8b918f2be29" containerID="b15dd32c508c57fc92e8f861840ce7e26943f81d003b70341ae6179de103501d" exitCode=1
Apr 16 16:24:08.266013 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:08.265692 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv" event={"ID":"8e540668-c921-499a-b737-e8b918f2be29","Type":"ContainerDied","Data":"b15dd32c508c57fc92e8f861840ce7e26943f81d003b70341ae6179de103501d"}
Apr 16 16:24:09.270895 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:09.270864 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv_8e540668-c921-499a-b737-e8b918f2be29/storage-initializer/0.log"
Apr 16 16:24:09.271374 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:09.270943 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv" event={"ID":"8e540668-c921-499a-b737-e8b918f2be29","Type":"ContainerStarted","Data":"0617f6a30167597cb0a0cbf1bc202fadc26422fd5d013ff3a8f144f134f5b5c7"}
Apr 16 16:24:11.134066 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.134029 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv"]
Apr 16 16:24:11.134654 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.134284 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv" podUID="8e540668-c921-499a-b737-e8b918f2be29" containerName="storage-initializer" containerID="cri-o://0617f6a30167597cb0a0cbf1bc202fadc26422fd5d013ff3a8f144f134f5b5c7" gracePeriod=30
Apr 16 16:24:11.257976 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.257939 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g"]
Apr 16 16:24:11.258352 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.258321 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ee006b4-beab-4d69-94e9-eccc23549281" containerName="kserve-container"
Apr 16 16:24:11.258352 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.258352 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee006b4-beab-4d69-94e9-eccc23549281" containerName="kserve-container"
Apr 16 16:24:11.258441 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.258361 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e44d3b1-7732-4fc5-81d9-17269abf124a" containerName="storage-initializer"
Apr 16 16:24:11.258441 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.258367 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e44d3b1-7732-4fc5-81d9-17269abf124a" containerName="storage-initializer"
Apr 16 16:24:11.258441 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.258389 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ee006b4-beab-4d69-94e9-eccc23549281" containerName="storage-initializer"
Apr 16 16:24:11.258441 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.258395 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee006b4-beab-4d69-94e9-eccc23549281" containerName="storage-initializer"
Apr 16 16:24:11.258556 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.258446 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e44d3b1-7732-4fc5-81d9-17269abf124a" containerName="storage-initializer"
Apr 16 16:24:11.258556 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.258454 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ee006b4-beab-4d69-94e9-eccc23549281" containerName="kserve-container"
Apr 16 16:24:11.258556 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.258497 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e44d3b1-7732-4fc5-81d9-17269abf124a" containerName="storage-initializer"
Apr 16 16:24:11.258556 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.258502 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e44d3b1-7732-4fc5-81d9-17269abf124a" containerName="storage-initializer"
Apr 16 16:24:11.258670 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.258570 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e44d3b1-7732-4fc5-81d9-17269abf124a" containerName="storage-initializer"
Apr 16 16:24:11.261495 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.261470 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g"
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" Apr 16 16:24:11.263838 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.263818 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-wqwt4\"" Apr 16 16:24:11.268503 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.268235 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g"] Apr 16 16:24:11.400039 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.399946 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/19e85e6e-44d4-415b-aa6f-4bb4a8ddb850-kserve-provision-location\") pod \"raw-sklearn-5512a-predictor-79fc6d66bc-djd6g\" (UID: \"19e85e6e-44d4-415b-aa6f-4bb4a8ddb850\") " pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" Apr 16 16:24:11.501083 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.501047 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/19e85e6e-44d4-415b-aa6f-4bb4a8ddb850-kserve-provision-location\") pod \"raw-sklearn-5512a-predictor-79fc6d66bc-djd6g\" (UID: \"19e85e6e-44d4-415b-aa6f-4bb4a8ddb850\") " pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" Apr 16 16:24:11.501445 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.501429 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/19e85e6e-44d4-415b-aa6f-4bb4a8ddb850-kserve-provision-location\") pod \"raw-sklearn-5512a-predictor-79fc6d66bc-djd6g\" (UID: \"19e85e6e-44d4-415b-aa6f-4bb4a8ddb850\") " pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" Apr 16 16:24:11.572171 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.572134 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" Apr 16 16:24:11.696314 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:11.696177 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g"] Apr 16 16:24:11.700232 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:24:11.700203 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19e85e6e_44d4_415b_aa6f_4bb4a8ddb850.slice/crio-ea405db2337527573f245694f15fbb942cb9a9edc196cc2e71e062ecef9f0b1f WatchSource:0}: Error finding container ea405db2337527573f245694f15fbb942cb9a9edc196cc2e71e062ecef9f0b1f: Status 404 returned error can't find the container with id ea405db2337527573f245694f15fbb942cb9a9edc196cc2e71e062ecef9f0b1f Apr 16 16:24:12.283555 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:12.283517 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" event={"ID":"19e85e6e-44d4-415b-aa6f-4bb4a8ddb850","Type":"ContainerStarted","Data":"641f50a035b1ac4c818117abed8e2de97e7b91cfb68433bfff098d0dfcb07d3d"} Apr 16 16:24:12.283555 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:12.283556 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" event={"ID":"19e85e6e-44d4-415b-aa6f-4bb4a8ddb850","Type":"ContainerStarted","Data":"ea405db2337527573f245694f15fbb942cb9a9edc196cc2e71e062ecef9f0b1f"} Apr 16 16:24:12.675142 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:12.675118 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv_8e540668-c921-499a-b737-e8b918f2be29/storage-initializer/1.log" Apr 16 16:24:12.675496 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:12.675477 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv_8e540668-c921-499a-b737-e8b918f2be29/storage-initializer/0.log" Apr 16 16:24:12.675597 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:12.675553 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv" Apr 16 16:24:12.811213 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:12.811126 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/8e540668-c921-499a-b737-e8b918f2be29-cabundle-cert\") pod \"8e540668-c921-499a-b737-e8b918f2be29\" (UID: \"8e540668-c921-499a-b737-e8b918f2be29\") " Apr 16 16:24:12.811213 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:12.811202 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e540668-c921-499a-b737-e8b918f2be29-kserve-provision-location\") pod \"8e540668-c921-499a-b737-e8b918f2be29\" (UID: \"8e540668-c921-499a-b737-e8b918f2be29\") " Apr 16 16:24:12.811487 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:12.811463 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e540668-c921-499a-b737-e8b918f2be29-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8e540668-c921-499a-b737-e8b918f2be29" (UID: "8e540668-c921-499a-b737-e8b918f2be29"). 
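The "Killing container with a grace period" entries above (gracePeriod=30) reflect the standard SIGTERM-then-SIGKILL shutdown budget. A minimal sketch of that pattern, assuming a plain child process rather than the kubelet's actual CRI-driven path:

```go
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

// stopWithGrace sends SIGTERM, waits up to grace for the child to exit,
// then falls back to SIGKILL. The real kubelet drives the equivalent
// sequence through the container runtime, not os/exec.
func stopWithGrace(cmd *exec.Cmd, grace time.Duration) error {
	if err := cmd.Process.Signal(syscall.SIGTERM); err != nil {
		return err
	}
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()
	select {
	case err := <-done:
		return err // exited within the grace period
	case <-time.After(grace):
		cmd.Process.Kill() // budget exhausted: SIGKILL
		return <-done
	}
}

func main() {
	cmd := exec.Command("sleep", "300")
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	fmt.Println(stopWithGrace(cmd, 30*time.Second))
}
```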
Apr 16 16:24:12.811487 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:12.811463 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e540668-c921-499a-b737-e8b918f2be29-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8e540668-c921-499a-b737-e8b918f2be29" (UID: "8e540668-c921-499a-b737-e8b918f2be29"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:24:12.811528 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:12.811468 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e540668-c921-499a-b737-e8b918f2be29-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "8e540668-c921-499a-b737-e8b918f2be29" (UID: "8e540668-c921-499a-b737-e8b918f2be29"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:24:12.912260 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:12.912222 2581 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/8e540668-c921-499a-b737-e8b918f2be29-cabundle-cert\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 16 16:24:12.912260 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:12.912254 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8e540668-c921-499a-b737-e8b918f2be29-kserve-provision-location\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 16 16:24:13.288676 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:13.288589 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv_8e540668-c921-499a-b737-e8b918f2be29/storage-initializer/1.log"
Apr 16 16:24:13.289064 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:13.288991 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv_8e540668-c921-499a-b737-e8b918f2be29/storage-initializer/0.log"
Apr 16 16:24:13.289064 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:13.289023 2581 generic.go:358] "Generic (PLEG): container finished" podID="8e540668-c921-499a-b737-e8b918f2be29" containerID="0617f6a30167597cb0a0cbf1bc202fadc26422fd5d013ff3a8f144f134f5b5c7" exitCode=1
Apr 16 16:24:13.289064 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:13.289053 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv" event={"ID":"8e540668-c921-499a-b737-e8b918f2be29","Type":"ContainerDied","Data":"0617f6a30167597cb0a0cbf1bc202fadc26422fd5d013ff3a8f144f134f5b5c7"}
Apr 16 16:24:13.289186 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:13.289089 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv" event={"ID":"8e540668-c921-499a-b737-e8b918f2be29","Type":"ContainerDied","Data":"ba8ee690b0d2885d2b7e71556e6aa243b93ef729c8424dac93eac95b5bbc4cf9"}
Apr 16 16:24:13.289186 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:13.289109 2581 scope.go:117] "RemoveContainer" containerID="0617f6a30167597cb0a0cbf1bc202fadc26422fd5d013ff3a8f144f134f5b5c7"
Apr 16 16:24:13.289186 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:13.289146 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv"
Apr 16 16:24:13.297585 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:13.297568 2581 scope.go:117] "RemoveContainer" containerID="b15dd32c508c57fc92e8f861840ce7e26943f81d003b70341ae6179de103501d"
Apr 16 16:24:13.304322 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:13.304305 2581 scope.go:117] "RemoveContainer" containerID="0617f6a30167597cb0a0cbf1bc202fadc26422fd5d013ff3a8f144f134f5b5c7"
Apr 16 16:24:13.304589 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:24:13.304571 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0617f6a30167597cb0a0cbf1bc202fadc26422fd5d013ff3a8f144f134f5b5c7\": container with ID starting with 0617f6a30167597cb0a0cbf1bc202fadc26422fd5d013ff3a8f144f134f5b5c7 not found: ID does not exist" containerID="0617f6a30167597cb0a0cbf1bc202fadc26422fd5d013ff3a8f144f134f5b5c7"
Apr 16 16:24:13.304636 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:13.304599 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0617f6a30167597cb0a0cbf1bc202fadc26422fd5d013ff3a8f144f134f5b5c7"} err="failed to get container status \"0617f6a30167597cb0a0cbf1bc202fadc26422fd5d013ff3a8f144f134f5b5c7\": rpc error: code = NotFound desc = could not find container \"0617f6a30167597cb0a0cbf1bc202fadc26422fd5d013ff3a8f144f134f5b5c7\": container with ID starting with 0617f6a30167597cb0a0cbf1bc202fadc26422fd5d013ff3a8f144f134f5b5c7 not found: ID does not exist"
Apr 16 16:24:13.304636 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:13.304617 2581 scope.go:117] "RemoveContainer" containerID="b15dd32c508c57fc92e8f861840ce7e26943f81d003b70341ae6179de103501d"
Apr 16 16:24:13.304820 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:24:13.304804 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b15dd32c508c57fc92e8f861840ce7e26943f81d003b70341ae6179de103501d\": container with ID starting with b15dd32c508c57fc92e8f861840ce7e26943f81d003b70341ae6179de103501d not found: ID does not exist" containerID="b15dd32c508c57fc92e8f861840ce7e26943f81d003b70341ae6179de103501d"
Apr 16 16:24:13.304860 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:13.304824 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b15dd32c508c57fc92e8f861840ce7e26943f81d003b70341ae6179de103501d"} err="failed to get container status \"b15dd32c508c57fc92e8f861840ce7e26943f81d003b70341ae6179de103501d\": rpc error: code = NotFound desc = could not find container \"b15dd32c508c57fc92e8f861840ce7e26943f81d003b70341ae6179de103501d\": container with ID starting with b15dd32c508c57fc92e8f861840ce7e26943f81d003b70341ae6179de103501d not found: ID does not exist"
Apr 16 16:24:13.323491 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:13.323446 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv"]
Apr 16 16:24:13.325549 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:13.325521 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b43ebd-predictor-84bd96d595-89tkv"]
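The "ContainerStatus from runtime service failed ... NotFound" / "DeleteContainer returned error" pairs above are a benign race: the container was already removed by the time the cleanup path re-checked it. A sketch of the idempotent-delete pattern such cleanup code typically uses, with removeContainer standing in for the real CRI call (hypothetical helper, not the kubelet's code):

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer stands in for a CRI RemoveContainer RPC that finds
// the container already gone.
func removeContainer(id string) error {
	return status.Error(codes.NotFound, "could not find container "+id)
}

// cleanup treats NotFound as already-deleted, so retries converge on
// the desired state instead of failing.
func cleanup(id string) error {
	if err := removeContainer(id); err != nil {
		if status.Code(err) == codes.NotFound {
			return nil // already gone: nothing left to do
		}
		return err
	}
	return nil
}

func main() {
	fmt.Println(cleanup("0617f6a30167")) // prints <nil>: NotFound swallowed
}
```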
path="/var/lib/kubelet/pods/8e540668-c921-499a-b737-e8b918f2be29/volumes" Apr 16 16:24:16.300923 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:16.300883 2581 generic.go:358] "Generic (PLEG): container finished" podID="19e85e6e-44d4-415b-aa6f-4bb4a8ddb850" containerID="641f50a035b1ac4c818117abed8e2de97e7b91cfb68433bfff098d0dfcb07d3d" exitCode=0 Apr 16 16:24:16.301288 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:16.300955 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" event={"ID":"19e85e6e-44d4-415b-aa6f-4bb4a8ddb850","Type":"ContainerDied","Data":"641f50a035b1ac4c818117abed8e2de97e7b91cfb68433bfff098d0dfcb07d3d"} Apr 16 16:24:17.305277 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:17.305244 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" event={"ID":"19e85e6e-44d4-415b-aa6f-4bb4a8ddb850","Type":"ContainerStarted","Data":"2c6c23c99972372437076070475af57156b69d3dc53dcfd7632dfbec72326007"} Apr 16 16:24:17.305689 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:17.305538 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" Apr 16 16:24:17.306833 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:17.306806 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" podUID="19e85e6e-44d4-415b-aa6f-4bb4a8ddb850" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 16:24:17.323252 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:17.323207 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" podStartSLOduration=6.323190072 podStartE2EDuration="6.323190072s" podCreationTimestamp="2026-04-16 16:24:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:24:17.322191589 +0000 UTC m=+1285.499252644" watchObservedRunningTime="2026-04-16 16:24:17.323190072 +0000 UTC m=+1285.500251127" Apr 16 16:24:18.308726 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:18.308688 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" podUID="19e85e6e-44d4-415b-aa6f-4bb4a8ddb850" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 16:24:28.309686 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:28.309641 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" podUID="19e85e6e-44d4-415b-aa6f-4bb4a8ddb850" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 16:24:38.309669 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:38.309627 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" podUID="19e85e6e-44d4-415b-aa6f-4bb4a8ddb850" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 16:24:48.308710 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:48.308667 2581 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" podUID="19e85e6e-44d4-415b-aa6f-4bb4a8ddb850" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 16:24:58.309563 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:24:58.309521 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" podUID="19e85e6e-44d4-415b-aa6f-4bb4a8ddb850" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 16:25:08.309470 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:08.309423 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" podUID="19e85e6e-44d4-415b-aa6f-4bb4a8ddb850" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 16:25:18.309421 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:18.309313 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" podUID="19e85e6e-44d4-415b-aa6f-4bb4a8ddb850" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 16:25:28.309809 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:28.309767 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" Apr 16 16:25:31.386944 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:31.386906 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g"] Apr 16 16:25:31.387357 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:31.387182 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" podUID="19e85e6e-44d4-415b-aa6f-4bb4a8ddb850" containerName="kserve-container" containerID="cri-o://2c6c23c99972372437076070475af57156b69d3dc53dcfd7632dfbec72326007" gracePeriod=30 Apr 16 16:25:31.452767 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:31.452732 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c"] Apr 16 16:25:31.453071 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:31.453046 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e540668-c921-499a-b737-e8b918f2be29" containerName="storage-initializer" Apr 16 16:25:31.453071 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:31.453065 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e540668-c921-499a-b737-e8b918f2be29" containerName="storage-initializer" Apr 16 16:25:31.453150 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:31.453128 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e540668-c921-499a-b737-e8b918f2be29" containerName="storage-initializer" Apr 16 16:25:31.453195 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:31.453186 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e540668-c921-499a-b737-e8b918f2be29" containerName="storage-initializer" Apr 16 16:25:31.453229 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:31.453195 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e540668-c921-499a-b737-e8b918f2be29" containerName="storage-initializer" Apr 
Apr 16 16:25:31.453261 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:31.453233 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e540668-c921-499a-b737-e8b918f2be29" containerName="storage-initializer"
Apr 16 16:25:31.456176 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:31.456160 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c"
Apr 16 16:25:31.463814 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:31.463788 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c"]
Apr 16 16:25:31.522142 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:31.522106 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fada71ee-dd50-43ce-8b32-6c2ae09e643a-kserve-provision-location\") pod \"raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c\" (UID: \"fada71ee-dd50-43ce-8b32-6c2ae09e643a\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c"
Apr 16 16:25:31.622640 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:31.622605 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fada71ee-dd50-43ce-8b32-6c2ae09e643a-kserve-provision-location\") pod \"raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c\" (UID: \"fada71ee-dd50-43ce-8b32-6c2ae09e643a\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c"
Apr 16 16:25:31.622953 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:31.622938 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fada71ee-dd50-43ce-8b32-6c2ae09e643a-kserve-provision-location\") pod \"raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c\" (UID: \"fada71ee-dd50-43ce-8b32-6c2ae09e643a\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c"
Apr 16 16:25:31.767942 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:31.767834 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c"
Apr 16 16:25:31.889724 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:31.889687 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c"]
Apr 16 16:25:31.893406 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:25:31.893375 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfada71ee_dd50_43ce_8b32_6c2ae09e643a.slice/crio-25a498d98208d04214c86893923662f2d885b18b23d32a2570bdb36608e39ebc WatchSource:0}: Error finding container 25a498d98208d04214c86893923662f2d885b18b23d32a2570bdb36608e39ebc: Status 404 returned error can't find the container with id 25a498d98208d04214c86893923662f2d885b18b23d32a2570bdb36608e39ebc
Apr 16 16:25:32.537014 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:32.536980 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c" event={"ID":"fada71ee-dd50-43ce-8b32-6c2ae09e643a","Type":"ContainerStarted","Data":"e74ed57fb0b83ae97f799ec533430dca9068a1e4fd10f0c28fbf5a36b6d4ecca"}
Apr 16 16:25:32.537014 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:32.537020 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c" event={"ID":"fada71ee-dd50-43ce-8b32-6c2ae09e643a","Type":"ContainerStarted","Data":"25a498d98208d04214c86893923662f2d885b18b23d32a2570bdb36608e39ebc"}
Apr 16 16:25:35.547104 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:35.547071 2581 generic.go:358] "Generic (PLEG): container finished" podID="fada71ee-dd50-43ce-8b32-6c2ae09e643a" containerID="e74ed57fb0b83ae97f799ec533430dca9068a1e4fd10f0c28fbf5a36b6d4ecca" exitCode=0
Apr 16 16:25:35.547593 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:35.547151 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c" event={"ID":"fada71ee-dd50-43ce-8b32-6c2ae09e643a","Type":"ContainerDied","Data":"e74ed57fb0b83ae97f799ec533430dca9068a1e4fd10f0c28fbf5a36b6d4ecca"}
Apr 16 16:25:35.828360 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:35.828321 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g"
Apr 16 16:25:35.957432 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:35.957394 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/19e85e6e-44d4-415b-aa6f-4bb4a8ddb850-kserve-provision-location\") pod \"19e85e6e-44d4-415b-aa6f-4bb4a8ddb850\" (UID: \"19e85e6e-44d4-415b-aa6f-4bb4a8ddb850\") "
Apr 16 16:25:35.957736 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:35.957711 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19e85e6e-44d4-415b-aa6f-4bb4a8ddb850-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "19e85e6e-44d4-415b-aa6f-4bb4a8ddb850" (UID: "19e85e6e-44d4-415b-aa6f-4bb4a8ddb850"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:25:36.058228 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:36.058201 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/19e85e6e-44d4-415b-aa6f-4bb4a8ddb850-kserve-provision-location\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 16 16:25:36.552946 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:36.552895 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c" event={"ID":"fada71ee-dd50-43ce-8b32-6c2ae09e643a","Type":"ContainerStarted","Data":"6deb7a4c5e7cbc624df61f5764992574a4bf6bbfb763efc189b0edab4e8da95d"}
Apr 16 16:25:36.553426 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:36.553257 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c"
Apr 16 16:25:36.554329 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:36.554300 2581 generic.go:358] "Generic (PLEG): container finished" podID="19e85e6e-44d4-415b-aa6f-4bb4a8ddb850" containerID="2c6c23c99972372437076070475af57156b69d3dc53dcfd7632dfbec72326007" exitCode=0
Apr 16 16:25:36.554638 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:36.554366 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" event={"ID":"19e85e6e-44d4-415b-aa6f-4bb4a8ddb850","Type":"ContainerDied","Data":"2c6c23c99972372437076070475af57156b69d3dc53dcfd7632dfbec72326007"}
Apr 16 16:25:36.554638 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:36.554375 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g"
Apr 16 16:25:36.554638 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:36.554385 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g" event={"ID":"19e85e6e-44d4-415b-aa6f-4bb4a8ddb850","Type":"ContainerDied","Data":"ea405db2337527573f245694f15fbb942cb9a9edc196cc2e71e062ecef9f0b1f"}
Apr 16 16:25:36.554638 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:36.554400 2581 scope.go:117] "RemoveContainer" containerID="2c6c23c99972372437076070475af57156b69d3dc53dcfd7632dfbec72326007"
Apr 16 16:25:36.554854 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:36.554732 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c" podUID="fada71ee-dd50-43ce-8b32-6c2ae09e643a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 16:25:36.561996 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:36.561979 2581 scope.go:117] "RemoveContainer" containerID="641f50a035b1ac4c818117abed8e2de97e7b91cfb68433bfff098d0dfcb07d3d"
Apr 16 16:25:36.569191 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:36.569173 2581 scope.go:117] "RemoveContainer" containerID="2c6c23c99972372437076070475af57156b69d3dc53dcfd7632dfbec72326007"
Apr 16 16:25:36.569571 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:25:36.569539 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c6c23c99972372437076070475af57156b69d3dc53dcfd7632dfbec72326007\": container with ID starting with 2c6c23c99972372437076070475af57156b69d3dc53dcfd7632dfbec72326007 not found: ID does not exist" containerID="2c6c23c99972372437076070475af57156b69d3dc53dcfd7632dfbec72326007"
Apr 16 16:25:36.569649 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:36.569580 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c6c23c99972372437076070475af57156b69d3dc53dcfd7632dfbec72326007"} err="failed to get container status \"2c6c23c99972372437076070475af57156b69d3dc53dcfd7632dfbec72326007\": rpc error: code = NotFound desc = could not find container \"2c6c23c99972372437076070475af57156b69d3dc53dcfd7632dfbec72326007\": container with ID starting with 2c6c23c99972372437076070475af57156b69d3dc53dcfd7632dfbec72326007 not found: ID does not exist"
Apr 16 16:25:36.569649 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:36.569599 2581 scope.go:117] "RemoveContainer" containerID="641f50a035b1ac4c818117abed8e2de97e7b91cfb68433bfff098d0dfcb07d3d"
Apr 16 16:25:36.569892 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:25:36.569875 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"641f50a035b1ac4c818117abed8e2de97e7b91cfb68433bfff098d0dfcb07d3d\": container with ID starting with 641f50a035b1ac4c818117abed8e2de97e7b91cfb68433bfff098d0dfcb07d3d not found: ID does not exist" containerID="641f50a035b1ac4c818117abed8e2de97e7b91cfb68433bfff098d0dfcb07d3d"
Apr 16 16:25:36.569945 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:36.569874 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c" podStartSLOduration=5.569862341 podStartE2EDuration="5.569862341s" podCreationTimestamp="2026-04-16 16:25:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:25:36.567808342 +0000 UTC m=+1364.744869398" watchObservedRunningTime="2026-04-16 16:25:36.569862341 +0000 UTC m=+1364.746923398"
Apr 16 16:25:36.569945 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:36.569895 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641f50a035b1ac4c818117abed8e2de97e7b91cfb68433bfff098d0dfcb07d3d"} err="failed to get container status \"641f50a035b1ac4c818117abed8e2de97e7b91cfb68433bfff098d0dfcb07d3d\": rpc error: code = NotFound desc = could not find container \"641f50a035b1ac4c818117abed8e2de97e7b91cfb68433bfff098d0dfcb07d3d\": container with ID starting with 641f50a035b1ac4c818117abed8e2de97e7b91cfb68433bfff098d0dfcb07d3d not found: ID does not exist"
Apr 16 16:25:36.581627 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:36.581597 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g"]
Apr 16 16:25:36.583526 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:36.583505 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-5512a-predictor-79fc6d66bc-djd6g"]
Apr 16 16:25:37.559439 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:37.559403 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c" podUID="fada71ee-dd50-43ce-8b32-6c2ae09e643a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 16:25:38.376046 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:38.376012 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19e85e6e-44d4-415b-aa6f-4bb4a8ddb850" path="/var/lib/kubelet/pods/19e85e6e-44d4-415b-aa6f-4bb4a8ddb850/volumes"
Apr 16 16:25:47.560449 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:47.560398 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c" podUID="fada71ee-dd50-43ce-8b32-6c2ae09e643a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 16:25:57.559926 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:25:57.559882 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c" podUID="fada71ee-dd50-43ce-8b32-6c2ae09e643a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 16:26:07.559439 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:07.559393 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c" podUID="fada71ee-dd50-43ce-8b32-6c2ae09e643a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 16:26:17.560306 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:17.560257 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c" podUID="fada71ee-dd50-43ce-8b32-6c2ae09e643a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 16:26:27.559631 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:27.559583 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c" podUID="fada71ee-dd50-43ce-8b32-6c2ae09e643a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 16:26:37.560141 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:37.560098 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c" podUID="fada71ee-dd50-43ce-8b32-6c2ae09e643a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 16:26:38.372827 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:38.372787 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c" podUID="fada71ee-dd50-43ce-8b32-6c2ae09e643a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 16:26:48.375710 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:48.375622 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c"
Apr 16 16:26:51.604698 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:51.604661 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c"]
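The pod_startup_latency_tracker entries above report podStartSLOduration as roughly observedRunningTime minus podCreationTimestamp, with image pulls excluded (hence the zero-value firstStartedPulling/lastFinishedPulling when the images were already cached). A back-of-the-envelope check of the logged figure, assuming that subtraction; the tracker reads its own clock, so the result differs from podStartSLOduration=5.569862341 by a couple of milliseconds:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05 -0700 MST" // Go accepts fractional seconds when parsing
	created, _ := time.Parse(layout, "2026-04-16 16:25:31 +0000 UTC")
	running, _ := time.Parse(layout, "2026-04-16 16:25:36.567808342 +0000 UTC")
	fmt.Println(running.Sub(created)) // ~5.5678s, close to the logged 5.569862341
}
```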
pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c" podUID="fada71ee-dd50-43ce-8b32-6c2ae09e643a" containerName="kserve-container" containerID="cri-o://6deb7a4c5e7cbc624df61f5764992574a4bf6bbfb763efc189b0edab4e8da95d" gracePeriod=30 Apr 16 16:26:56.143990 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:56.143963 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c" Apr 16 16:26:56.263744 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:56.263648 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fada71ee-dd50-43ce-8b32-6c2ae09e643a-kserve-provision-location\") pod \"fada71ee-dd50-43ce-8b32-6c2ae09e643a\" (UID: \"fada71ee-dd50-43ce-8b32-6c2ae09e643a\") " Apr 16 16:26:56.263977 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:56.263952 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fada71ee-dd50-43ce-8b32-6c2ae09e643a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fada71ee-dd50-43ce-8b32-6c2ae09e643a" (UID: "fada71ee-dd50-43ce-8b32-6c2ae09e643a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:26:56.365011 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:56.364975 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fada71ee-dd50-43ce-8b32-6c2ae09e643a-kserve-provision-location\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 16 16:26:56.814794 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:56.814763 2581 generic.go:358] "Generic (PLEG): container finished" podID="fada71ee-dd50-43ce-8b32-6c2ae09e643a" containerID="6deb7a4c5e7cbc624df61f5764992574a4bf6bbfb763efc189b0edab4e8da95d" exitCode=0 Apr 16 16:26:56.814794 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:56.814800 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c" event={"ID":"fada71ee-dd50-43ce-8b32-6c2ae09e643a","Type":"ContainerDied","Data":"6deb7a4c5e7cbc624df61f5764992574a4bf6bbfb763efc189b0edab4e8da95d"} Apr 16 16:26:56.815064 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:56.814822 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c" event={"ID":"fada71ee-dd50-43ce-8b32-6c2ae09e643a","Type":"ContainerDied","Data":"25a498d98208d04214c86893923662f2d885b18b23d32a2570bdb36608e39ebc"} Apr 16 16:26:56.815064 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:56.814842 2581 scope.go:117] "RemoveContainer" containerID="6deb7a4c5e7cbc624df61f5764992574a4bf6bbfb763efc189b0edab4e8da95d" Apr 16 16:26:56.815064 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:56.814850 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c" Apr 16 16:26:56.822462 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:56.822443 2581 scope.go:117] "RemoveContainer" containerID="e74ed57fb0b83ae97f799ec533430dca9068a1e4fd10f0c28fbf5a36b6d4ecca" Apr 16 16:26:56.829063 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:56.829042 2581 scope.go:117] "RemoveContainer" containerID="6deb7a4c5e7cbc624df61f5764992574a4bf6bbfb763efc189b0edab4e8da95d" Apr 16 16:26:56.829319 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:26:56.829288 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6deb7a4c5e7cbc624df61f5764992574a4bf6bbfb763efc189b0edab4e8da95d\": container with ID starting with 6deb7a4c5e7cbc624df61f5764992574a4bf6bbfb763efc189b0edab4e8da95d not found: ID does not exist" containerID="6deb7a4c5e7cbc624df61f5764992574a4bf6bbfb763efc189b0edab4e8da95d" Apr 16 16:26:56.829472 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:56.829332 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6deb7a4c5e7cbc624df61f5764992574a4bf6bbfb763efc189b0edab4e8da95d"} err="failed to get container status \"6deb7a4c5e7cbc624df61f5764992574a4bf6bbfb763efc189b0edab4e8da95d\": rpc error: code = NotFound desc = could not find container \"6deb7a4c5e7cbc624df61f5764992574a4bf6bbfb763efc189b0edab4e8da95d\": container with ID starting with 6deb7a4c5e7cbc624df61f5764992574a4bf6bbfb763efc189b0edab4e8da95d not found: ID does not exist" Apr 16 16:26:56.829472 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:56.829418 2581 scope.go:117] "RemoveContainer" containerID="e74ed57fb0b83ae97f799ec533430dca9068a1e4fd10f0c28fbf5a36b6d4ecca" Apr 16 16:26:56.829807 ip-10-0-135-144 kubenswrapper[2581]: E0416 16:26:56.829758 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e74ed57fb0b83ae97f799ec533430dca9068a1e4fd10f0c28fbf5a36b6d4ecca\": container with ID starting with e74ed57fb0b83ae97f799ec533430dca9068a1e4fd10f0c28fbf5a36b6d4ecca not found: ID does not exist" containerID="e74ed57fb0b83ae97f799ec533430dca9068a1e4fd10f0c28fbf5a36b6d4ecca" Apr 16 16:26:56.829921 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:56.829800 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e74ed57fb0b83ae97f799ec533430dca9068a1e4fd10f0c28fbf5a36b6d4ecca"} err="failed to get container status \"e74ed57fb0b83ae97f799ec533430dca9068a1e4fd10f0c28fbf5a36b6d4ecca\": rpc error: code = NotFound desc = could not find container \"e74ed57fb0b83ae97f799ec533430dca9068a1e4fd10f0c28fbf5a36b6d4ecca\": container with ID starting with e74ed57fb0b83ae97f799ec533430dca9068a1e4fd10f0c28fbf5a36b6d4ecca not found: ID does not exist" Apr 16 16:26:56.831389 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:56.831368 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c"] Apr 16 16:26:56.833720 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:56.833698 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-82bab-predictor-c74656cbf-cdn6c"] Apr 16 16:26:58.375150 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:26:58.375110 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fada71ee-dd50-43ce-8b32-6c2ae09e643a" 
path="/var/lib/kubelet/pods/fada71ee-dd50-43ce-8b32-6c2ae09e643a/volumes" Apr 16 16:27:19.693430 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:19.693398 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-r8wpm_487d3578-03cc-4a7d-9a13-eb666eaf2cf2/global-pull-secret-syncer/0.log" Apr 16 16:27:19.821191 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:19.821160 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-cqtjh_ef56dfc1-1254-428e-ba95-899e4b0e0908/konnectivity-agent/0.log" Apr 16 16:27:19.887146 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:19.887114 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-144.ec2.internal_bb09a532e924a099c0b64a77b795e41f/haproxy/0.log" Apr 16 16:27:23.558589 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:23.558559 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-857bc6d45b-7q7v8_58d82fc7-9e0f-4ce8-b1b0-e5d95ef105c5/metrics-server/0.log" Apr 16 16:27:23.697950 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:23.697921 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9wfjm_a6e16886-bf74-4956-bc2c-ec8432af1f06/node-exporter/0.log" Apr 16 16:27:23.721244 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:23.721217 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9wfjm_a6e16886-bf74-4956-bc2c-ec8432af1f06/kube-rbac-proxy/0.log" Apr 16 16:27:23.741969 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:23.741944 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9wfjm_a6e16886-bf74-4956-bc2c-ec8432af1f06/init-textfile/0.log" Apr 16 16:27:23.854317 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:23.854240 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-4f9bt_792a7b05-8abe-4340-8ed9-d8a1811f25f6/kube-rbac-proxy-main/0.log" Apr 16 16:27:23.875032 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:23.875001 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-4f9bt_792a7b05-8abe-4340-8ed9-d8a1811f25f6/kube-rbac-proxy-self/0.log" Apr 16 16:27:23.894728 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:23.894702 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-4f9bt_792a7b05-8abe-4340-8ed9-d8a1811f25f6/openshift-state-metrics/0.log" Apr 16 16:27:24.158921 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:24.158843 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-d27hf_e2452edd-6580-48d0-aa50-8d8eda8abc6f/prometheus-operator-admission-webhook/0.log" Apr 16 16:27:26.988270 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:26.988237 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6"] Apr 16 16:27:26.988686 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:26.988544 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fada71ee-dd50-43ce-8b32-6c2ae09e643a" containerName="storage-initializer" Apr 16 16:27:26.988686 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:26.988555 2581 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fada71ee-dd50-43ce-8b32-6c2ae09e643a" containerName="storage-initializer" Apr 16 16:27:26.988686 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:26.988563 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19e85e6e-44d4-415b-aa6f-4bb4a8ddb850" containerName="storage-initializer" Apr 16 16:27:26.988686 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:26.988569 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e85e6e-44d4-415b-aa6f-4bb4a8ddb850" containerName="storage-initializer" Apr 16 16:27:26.988686 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:26.988582 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fada71ee-dd50-43ce-8b32-6c2ae09e643a" containerName="kserve-container" Apr 16 16:27:26.988686 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:26.988588 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="fada71ee-dd50-43ce-8b32-6c2ae09e643a" containerName="kserve-container" Apr 16 16:27:26.988686 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:26.988599 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19e85e6e-44d4-415b-aa6f-4bb4a8ddb850" containerName="kserve-container" Apr 16 16:27:26.988686 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:26.988605 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e85e6e-44d4-415b-aa6f-4bb4a8ddb850" containerName="kserve-container" Apr 16 16:27:26.988686 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:26.988652 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="19e85e6e-44d4-415b-aa6f-4bb4a8ddb850" containerName="kserve-container" Apr 16 16:27:26.988686 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:26.988660 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="fada71ee-dd50-43ce-8b32-6c2ae09e643a" containerName="kserve-container" Apr 16 16:27:26.990879 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:26.990855 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" Apr 16 16:27:26.993267 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:26.993242 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4jnll\"/\"kube-root-ca.crt\"" Apr 16 16:27:26.993435 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:26.993348 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4jnll\"/\"openshift-service-ca.crt\"" Apr 16 16:27:26.994166 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:26.994143 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-4jnll\"/\"default-dockercfg-bq59w\"" Apr 16 16:27:26.998609 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:26.998585 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6"] Apr 16 16:27:27.105494 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.105443 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7594ba30-ff92-4156-85a3-ca48fd708b78-proc\") pod \"perf-node-gather-daemonset-d2sz6\" (UID: \"7594ba30-ff92-4156-85a3-ca48fd708b78\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" Apr 16 16:27:27.105494 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.105484 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7594ba30-ff92-4156-85a3-ca48fd708b78-sys\") pod \"perf-node-gather-daemonset-d2sz6\" (UID: \"7594ba30-ff92-4156-85a3-ca48fd708b78\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" Apr 16 16:27:27.105494 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.105505 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7594ba30-ff92-4156-85a3-ca48fd708b78-lib-modules\") pod \"perf-node-gather-daemonset-d2sz6\" (UID: \"7594ba30-ff92-4156-85a3-ca48fd708b78\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" Apr 16 16:27:27.105772 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.105607 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7594ba30-ff92-4156-85a3-ca48fd708b78-podres\") pod \"perf-node-gather-daemonset-d2sz6\" (UID: \"7594ba30-ff92-4156-85a3-ca48fd708b78\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" Apr 16 16:27:27.105772 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.105668 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp25f\" (UniqueName: \"kubernetes.io/projected/7594ba30-ff92-4156-85a3-ca48fd708b78-kube-api-access-kp25f\") pod \"perf-node-gather-daemonset-d2sz6\" (UID: \"7594ba30-ff92-4156-85a3-ca48fd708b78\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" Apr 16 16:27:27.206983 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.206945 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7594ba30-ff92-4156-85a3-ca48fd708b78-proc\") pod \"perf-node-gather-daemonset-d2sz6\" (UID: \"7594ba30-ff92-4156-85a3-ca48fd708b78\") " 
pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" Apr 16 16:27:27.206983 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.206985 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7594ba30-ff92-4156-85a3-ca48fd708b78-sys\") pod \"perf-node-gather-daemonset-d2sz6\" (UID: \"7594ba30-ff92-4156-85a3-ca48fd708b78\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" Apr 16 16:27:27.207222 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.207015 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7594ba30-ff92-4156-85a3-ca48fd708b78-lib-modules\") pod \"perf-node-gather-daemonset-d2sz6\" (UID: \"7594ba30-ff92-4156-85a3-ca48fd708b78\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" Apr 16 16:27:27.207222 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.207052 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7594ba30-ff92-4156-85a3-ca48fd708b78-podres\") pod \"perf-node-gather-daemonset-d2sz6\" (UID: \"7594ba30-ff92-4156-85a3-ca48fd708b78\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" Apr 16 16:27:27.207222 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.207080 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7594ba30-ff92-4156-85a3-ca48fd708b78-sys\") pod \"perf-node-gather-daemonset-d2sz6\" (UID: \"7594ba30-ff92-4156-85a3-ca48fd708b78\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" Apr 16 16:27:27.207222 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.207103 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kp25f\" (UniqueName: \"kubernetes.io/projected/7594ba30-ff92-4156-85a3-ca48fd708b78-kube-api-access-kp25f\") pod \"perf-node-gather-daemonset-d2sz6\" (UID: \"7594ba30-ff92-4156-85a3-ca48fd708b78\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" Apr 16 16:27:27.207222 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.207080 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7594ba30-ff92-4156-85a3-ca48fd708b78-proc\") pod \"perf-node-gather-daemonset-d2sz6\" (UID: \"7594ba30-ff92-4156-85a3-ca48fd708b78\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" Apr 16 16:27:27.207222 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.207199 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7594ba30-ff92-4156-85a3-ca48fd708b78-podres\") pod \"perf-node-gather-daemonset-d2sz6\" (UID: \"7594ba30-ff92-4156-85a3-ca48fd708b78\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" Apr 16 16:27:27.207222 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.207212 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7594ba30-ff92-4156-85a3-ca48fd708b78-lib-modules\") pod \"perf-node-gather-daemonset-d2sz6\" (UID: \"7594ba30-ff92-4156-85a3-ca48fd708b78\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" Apr 16 16:27:27.215404 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.215384 2581 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kp25f\" (UniqueName: \"kubernetes.io/projected/7594ba30-ff92-4156-85a3-ca48fd708b78-kube-api-access-kp25f\") pod \"perf-node-gather-daemonset-d2sz6\" (UID: \"7594ba30-ff92-4156-85a3-ca48fd708b78\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" Apr 16 16:27:27.302381 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.302324 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" Apr 16 16:27:27.420092 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.420067 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6"] Apr 16 16:27:27.422624 ip-10-0-135-144 kubenswrapper[2581]: W0416 16:27:27.422598 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7594ba30_ff92_4156_85a3_ca48fd708b78.slice/crio-6b3fdcd5d74d5e78106604690414546d84d467f6376497a10674b44ce84708b5 WatchSource:0}: Error finding container 6b3fdcd5d74d5e78106604690414546d84d467f6376497a10674b44ce84708b5: Status 404 returned error can't find the container with id 6b3fdcd5d74d5e78106604690414546d84d467f6376497a10674b44ce84708b5 Apr 16 16:27:27.424174 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.424156 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:27:27.610390 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.610287 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-t9bw7_2f1e4962-e3c5-4ee3-952d-c5105193db44/dns/0.log" Apr 16 16:27:27.630313 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.630287 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-t9bw7_2f1e4962-e3c5-4ee3-952d-c5105193db44/kube-rbac-proxy/0.log" Apr 16 16:27:27.699436 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.699412 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sjpbs_70dfab46-81b5-47d9-b69d-a3d94a7c5e13/dns-node-resolver/0.log" Apr 16 16:27:27.911871 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.911783 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" event={"ID":"7594ba30-ff92-4156-85a3-ca48fd708b78","Type":"ContainerStarted","Data":"85a5696916e203b0877470f07fffb4a22e89e460bba78decfe95514a56a8e7a3"} Apr 16 16:27:27.911871 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.911820 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" event={"ID":"7594ba30-ff92-4156-85a3-ca48fd708b78","Type":"ContainerStarted","Data":"6b3fdcd5d74d5e78106604690414546d84d467f6376497a10674b44ce84708b5"} Apr 16 16:27:27.912099 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.911904 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" Apr 16 16:27:27.928264 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.928216 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" podStartSLOduration=1.9281991760000001 podStartE2EDuration="1.928199176s" podCreationTimestamp="2026-04-16 16:27:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
Apr 16 16:27:27.928264 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:27.928216 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6" podStartSLOduration=1.9281991760000001 podStartE2EDuration="1.928199176s" podCreationTimestamp="2026-04-16 16:27:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:27:27.925861879 +0000 UTC m=+1476.102922934" watchObservedRunningTime="2026-04-16 16:27:27.928199176 +0000 UTC m=+1476.105260243"
Apr 16 16:27:28.128843 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:28.128805 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-t5mlw_d19613ed-0faf-481f-bc0d-b4f8fcf0f259/node-ca/0.log"
Apr 16 16:27:29.215782 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:29.215755 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dlmbb_59a52d95-8872-4fa3-b620-5f948a8a6e16/serve-healthcheck-canary/0.log"
Apr 16 16:27:29.645644 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:29.645615 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5gnbb_29fd374c-3a57-4945-9494-014bf8f71730/kube-rbac-proxy/0.log"
Apr 16 16:27:29.664751 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:29.664727 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5gnbb_29fd374c-3a57-4945-9494-014bf8f71730/exporter/0.log"
Apr 16 16:27:29.684077 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:29.684047 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5gnbb_29fd374c-3a57-4945-9494-014bf8f71730/extractor/0.log"
Apr 16 16:27:31.719929 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:31.719901 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-7c68cb4fc8-xnxw9_81ac8e20-f6b4-437d-a5da-95ea69a24a97/manager/0.log"
Apr 16 16:27:31.760734 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:31.760702 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-tbm4r_7efe7390-4de7-4f12-908a-334e6c6fe696/server/0.log"
Apr 16 16:27:31.833535 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:31.833507 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-76ck7_17995599-6c5a-4b44-9ee4-7a83662efb06/manager/0.log"
Apr 16 16:27:31.877862 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:31.877835 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-85qhn_c6ec5fcc-e19c-4b4f-9e6a-205f0f34d2bf/seaweedfs/0.log"
Apr 16 16:27:33.925382 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:33.925331 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-d2sz6"
Apr 16 16:27:37.164453 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:37.164421 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ml8fq_fe31cde4-f24b-44d8-9e19-ba426c58b544/kube-multus-additional-cni-plugins/0.log"
Apr 16 16:27:37.184083 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:37.184053 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ml8fq_fe31cde4-f24b-44d8-9e19-ba426c58b544/egress-router-binary-copy/0.log"
Apr 16 16:27:37.202248 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:37.202219 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ml8fq_fe31cde4-f24b-44d8-9e19-ba426c58b544/cni-plugins/0.log"
Apr 16 16:27:37.221017 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:37.220992 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ml8fq_fe31cde4-f24b-44d8-9e19-ba426c58b544/bond-cni-plugin/0.log"
Apr 16 16:27:37.240124 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:37.240098 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ml8fq_fe31cde4-f24b-44d8-9e19-ba426c58b544/routeoverride-cni/0.log"
Apr 16 16:27:37.259844 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:37.259819 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ml8fq_fe31cde4-f24b-44d8-9e19-ba426c58b544/whereabouts-cni-bincopy/0.log"
Apr 16 16:27:37.278278 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:37.278252 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ml8fq_fe31cde4-f24b-44d8-9e19-ba426c58b544/whereabouts-cni/0.log"
Apr 16 16:27:37.460908 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:37.460839 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fph7l_40d4bde4-af0f-486a-90e6-101c53fe3e24/kube-multus/0.log"
Apr 16 16:27:37.632219 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:37.632192 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rgfkx_0aa611e2-18d8-4712-9938-e8c21daeb1a0/network-metrics-daemon/0.log"
Apr 16 16:27:37.651582 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:37.651550 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rgfkx_0aa611e2-18d8-4712-9938-e8c21daeb1a0/kube-rbac-proxy/0.log"
Apr 16 16:27:38.951574 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:38.951548 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rdqjm_2c718a57-bd23-432b-bf19-493fd2ad600a/ovn-controller/0.log"
Apr 16 16:27:38.975483 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:38.975460 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rdqjm_2c718a57-bd23-432b-bf19-493fd2ad600a/ovn-acl-logging/0.log"
Apr 16 16:27:38.997976 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:38.997954 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rdqjm_2c718a57-bd23-432b-bf19-493fd2ad600a/kube-rbac-proxy-node/0.log"
Apr 16 16:27:39.019264 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:39.019241 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rdqjm_2c718a57-bd23-432b-bf19-493fd2ad600a/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 16:27:39.035863 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:39.035835 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rdqjm_2c718a57-bd23-432b-bf19-493fd2ad600a/northd/0.log"
Apr 16 16:27:39.055633 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:39.055603 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rdqjm_2c718a57-bd23-432b-bf19-493fd2ad600a/nbdb/0.log"
Apr 16 16:27:39.074871 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:39.074848 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rdqjm_2c718a57-bd23-432b-bf19-493fd2ad600a/sbdb/0.log"
Apr 16 16:27:39.167707 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:39.167670 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rdqjm_2c718a57-bd23-432b-bf19-493fd2ad600a/ovnkube-controller/0.log"
Apr 16 16:27:40.160067 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:40.160040 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-m7vqg_313de001-22f6-48de-8e2b-ba59ee1494ec/network-check-target-container/0.log"
Apr 16 16:27:41.086773 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:41.086742 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-vq8l4_3143666f-7f83-4d6e-ae14-75fdaf4f8e7c/iptables-alerter/0.log"
Apr 16 16:27:41.716079 ip-10-0-135-144 kubenswrapper[2581]: I0416 16:27:41.716055 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-662l5_2c3bafcf-9b23-44de-8e7b-05e8fb94b9ee/tuned/0.log"