Apr 17 07:59:42.127901 ip-10-0-128-245 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 07:59:42.127912 ip-10-0-128-245 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 07:59:42.127920 ip-10-0-128-245 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 07:59:42.128232 ip-10-0-128-245 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 07:59:52.348927 ip-10-0-128-245 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 07:59:52.348945 ip-10-0-128-245 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 3686a5091412422e841d3cc3a61f0bff --
Apr 17 08:02:37.182254 ip-10-0-128-245 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 08:02:37.628117 ip-10-0-128-245 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 08:02:37.628117 ip-10-0-128-245 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 08:02:37.628117 ip-10-0-128-245 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 08:02:37.628117 ip-10-0-128-245 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 08:02:37.628117 ip-10-0-128-245 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 08:02:37.631728 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.631614 2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 08:02:37.636973 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.636945 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 08:02:37.636973 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.636970 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 08:02:37.636973 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.636973 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 08:02:37.636973 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.636977 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 08:02:37.636973 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.636980 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 08:02:37.637138 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.636983 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 08:02:37.637138 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.636986 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 08:02:37.637138 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.636989 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 08:02:37.637138 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.636993 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 08:02:37.637138 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.636997 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 08:02:37.637138 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.636999 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 08:02:37.637138 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637002 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 08:02:37.637138 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637004 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 08:02:37.637138 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637008 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 08:02:37.637138 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637010 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 08:02:37.637138 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637013 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 08:02:37.637138 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637016 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 08:02:37.637138 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637018 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 08:02:37.637138 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637022 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 08:02:37.637138 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637025 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 08:02:37.637138 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637027 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 08:02:37.637138 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637030 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 08:02:37.637138 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637032 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 08:02:37.637138 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637035 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 08:02:37.637138 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637038 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 08:02:37.637616 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637042 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 08:02:37.637616 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637046 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 08:02:37.637616 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637049 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 08:02:37.637616 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637052 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 08:02:37.637616 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637054 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 08:02:37.637616 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637057 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 08:02:37.637616 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637060 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 08:02:37.637616 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637062 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 08:02:37.637616 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637065 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 08:02:37.637616 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637068 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 08:02:37.637616 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637070 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 08:02:37.637616 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637073 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 08:02:37.637616 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637075 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 08:02:37.637616 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637080 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 08:02:37.637616 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637083 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 08:02:37.637616 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637086 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 08:02:37.637616 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637089 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 08:02:37.637616 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637093 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 08:02:37.637616 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637095 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 08:02:37.637616 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637099 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 08:02:37.638198 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637102 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 08:02:37.638198 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637105 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 08:02:37.638198 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637108 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 08:02:37.638198 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637110 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 08:02:37.638198 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637113 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 08:02:37.638198 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637116 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 08:02:37.638198 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637119 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 08:02:37.638198 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637122 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 08:02:37.638198 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637125 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 08:02:37.638198 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637130 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 08:02:37.638198 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637133 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 08:02:37.638198 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637135 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 08:02:37.638198 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637138 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 08:02:37.638198 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637141 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 08:02:37.638198 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637145 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 08:02:37.638198 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637147 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 08:02:37.638198 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637150 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 08:02:37.638198 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637155 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 08:02:37.638198 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637159 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 08:02:37.638677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637163 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 08:02:37.638677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637167 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 08:02:37.638677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637174 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 08:02:37.638677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637178 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 08:02:37.638677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637182 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 08:02:37.638677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637185 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 08:02:37.638677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637188 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 08:02:37.638677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637191 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 08:02:37.638677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637194 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 08:02:37.638677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637197 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 08:02:37.638677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637199 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 08:02:37.638677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637202 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 08:02:37.638677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637205 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 08:02:37.638677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637207 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 08:02:37.638677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637210 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 08:02:37.638677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637213 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 08:02:37.638677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637216 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 08:02:37.638677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637219 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 08:02:37.638677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637221 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 08:02:37.638677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637224 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 08:02:37.639190 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637227 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 08:02:37.639190 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637229 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 08:02:37.639190 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637704 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 08:02:37.639190 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637710 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 08:02:37.639190 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637713 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 08:02:37.639190 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637716 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 08:02:37.639190 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637718 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 08:02:37.639190 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637721 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 08:02:37.639190 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637724 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 08:02:37.639190 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637726 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 08:02:37.639190 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637729 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 08:02:37.639190 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637731 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 08:02:37.639190 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637734 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 08:02:37.639190 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637736 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 08:02:37.639190 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637740 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 08:02:37.639190 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637743 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 08:02:37.639190 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637746 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 08:02:37.639190 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637749 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 08:02:37.639190 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637752 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 08:02:37.639190 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637755 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 08:02:37.639680 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637758 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 08:02:37.639680 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637761 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 08:02:37.639680 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637766 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 08:02:37.639680 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637769 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 08:02:37.639680 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637773 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 08:02:37.639680 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637775 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 08:02:37.639680 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637778 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 08:02:37.639680 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637781 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 08:02:37.639680 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637785 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 08:02:37.639680 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637788 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 08:02:37.639680 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637790 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 08:02:37.639680 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637793 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 08:02:37.639680 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637796 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 08:02:37.639680 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637798 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 08:02:37.639680 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637801 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 08:02:37.639680 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637803 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 08:02:37.639680 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637806 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 08:02:37.639680 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637809 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 08:02:37.639680 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637811 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 08:02:37.640171 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637814 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 08:02:37.640171 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637817 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 08:02:37.640171 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637820 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 08:02:37.640171 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637823 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 08:02:37.640171 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637825 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 08:02:37.640171 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637828 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 08:02:37.640171 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637831 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 08:02:37.640171 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637833 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 08:02:37.640171 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637843 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 08:02:37.640171 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637846 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 08:02:37.640171 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637849 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 08:02:37.640171 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637852 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 08:02:37.640171 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637854 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 08:02:37.640171 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637857 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 08:02:37.640171 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637859 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 08:02:37.640171 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637861 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 08:02:37.640171 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637864 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 08:02:37.640171 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637867 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 08:02:37.640171 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637869 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 08:02:37.640655 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637872 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 08:02:37.640655 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637874 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 08:02:37.640655 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637877 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 08:02:37.640655 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637880 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 08:02:37.640655 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637882 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 08:02:37.640655 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637885 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 08:02:37.640655 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637887 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 08:02:37.640655 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637890 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 08:02:37.640655 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637893 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 08:02:37.640655 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637895 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 08:02:37.640655 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637898 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 08:02:37.640655 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637901 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 08:02:37.640655 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637903 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 08:02:37.640655 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637906 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 08:02:37.640655 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637926 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 08:02:37.640655 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637929 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 08:02:37.640655 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637933 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 08:02:37.640655 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637936 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 08:02:37.640655 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637939 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 08:02:37.640655 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637942 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637945 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637948 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637957 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637960 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637962 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637965 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637968 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637970 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637972 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.637975 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638060 2580 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638069 2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638083 2580 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638088 2580 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638093 2580 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638097 2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638101 2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638106 2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638109 2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638112 2580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 08:02:37.641203 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638116 2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638119 2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638123 2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638126 2580 flags.go:64] FLAG: --cgroup-root=""
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638129 2580 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638132 2580 flags.go:64] FLAG: --client-ca-file=""
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638135 2580 flags.go:64] FLAG: --cloud-config=""
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638143 2580 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638146 2580 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638153 2580 flags.go:64] FLAG: --cluster-domain=""
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638156 2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638160 2580 flags.go:64] FLAG: --config-dir=""
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638163 2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638167 2580 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638171 2580 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638182 2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638186 2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638189 2580 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638192 2580 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638196 2580 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638199 2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638203 2580 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638206 2580 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638211 2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638214 2580 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 08:02:37.641708 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638223 2580 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638227 2580 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638230 2580 flags.go:64] FLAG: --enable-server="true"
Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638233 2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638240 2580 flags.go:64] FLAG: --event-burst="100"
Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638243 2580 flags.go:64] FLAG: --event-qps="50"
Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638246 2580 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638249 2580 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638253 2580 flags.go:64] FLAG: --eviction-hard=""
Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638257 2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638260 2580 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638263 2580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638267 2580 flags.go:64] FLAG: --eviction-soft=""
Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638270 2580 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638274 2580 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638278 2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638281 2580 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638284 2580 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638287 2580 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 08:02:37.642331 ip-10-0-128-245
kubenswrapper[2580]: I0417 08:02:37.638291 2580 flags.go:64] FLAG: --feature-gates="" Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638295 2580 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638298 2580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638301 2580 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638305 2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638308 2580 flags.go:64] FLAG: --healthz-port="10248" Apr 17 08:02:37.642331 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638311 2580 flags.go:64] FLAG: --help="false" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638315 2580 flags.go:64] FLAG: --hostname-override="ip-10-0-128-245.ec2.internal" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638318 2580 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638321 2580 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638324 2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638327 2580 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638331 2580 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638334 2580 flags.go:64] FLAG: 
--image-gc-low-threshold="80" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638337 2580 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638340 2580 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638343 2580 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638346 2580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638349 2580 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638352 2580 flags.go:64] FLAG: --kube-reserved="" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638355 2580 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638358 2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638361 2580 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638364 2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638368 2580 flags.go:64] FLAG: --lock-file="" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638371 2580 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638374 2580 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638379 2580 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: 
I0417 08:02:37.638385 2580 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 08:02:37.643075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638388 2580 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638391 2580 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638394 2580 flags.go:64] FLAG: --logging-format="text" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638398 2580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638401 2580 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638404 2580 flags.go:64] FLAG: --manifest-url="" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638407 2580 flags.go:64] FLAG: --manifest-url-header="" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638412 2580 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638416 2580 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638420 2580 flags.go:64] FLAG: --max-pods="110" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638423 2580 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638426 2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638429 2580 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638433 2580 flags.go:64] FLAG: 
--minimum-container-ttl-duration="6m0s" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638436 2580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638439 2580 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638442 2580 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638450 2580 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638453 2580 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638456 2580 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638460 2580 flags.go:64] FLAG: --pod-cidr="" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638463 2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638470 2580 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638473 2580 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 08:02:37.643665 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638476 2580 flags.go:64] FLAG: --pods-per-core="0" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638479 2580 flags.go:64] FLAG: --port="10250" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638482 2580 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638485 
2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0bdae65bc22e6c848" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638488 2580 flags.go:64] FLAG: --qos-reserved="" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638492 2580 flags.go:64] FLAG: --read-only-port="10255" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638494 2580 flags.go:64] FLAG: --register-node="true" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638499 2580 flags.go:64] FLAG: --register-schedulable="true" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638502 2580 flags.go:64] FLAG: --register-with-taints="" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638506 2580 flags.go:64] FLAG: --registry-burst="10" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638508 2580 flags.go:64] FLAG: --registry-qps="5" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638513 2580 flags.go:64] FLAG: --reserved-cpus="" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638516 2580 flags.go:64] FLAG: --reserved-memory="" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638521 2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638524 2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638527 2580 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638530 2580 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638536 2580 flags.go:64] FLAG: --runonce="false" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638540 2580 
flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638544 2580 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638547 2580 flags.go:64] FLAG: --seccomp-default="false" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638550 2580 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638553 2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638556 2580 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638559 2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638562 2580 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 08:02:37.644269 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638565 2580 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638568 2580 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638571 2580 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638575 2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638578 2580 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638581 2580 flags.go:64] FLAG: --system-cgroups="" Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638584 2580 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638589 2580 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638592 2580 flags.go:64] FLAG: --tls-cert-file="" Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638595 2580 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638599 2580 flags.go:64] FLAG: --tls-min-version="" Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638602 2580 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638605 2580 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638609 2580 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638612 2580 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638615 2580 flags.go:64] FLAG: --v="2" Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638620 2580 flags.go:64] FLAG: --version="false" Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638626 2580 flags.go:64] FLAG: --vmodule="" Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638630 2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.638633 2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638730 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 08:02:37.644928 ip-10-0-128-245 
kubenswrapper[2580]: W0417 08:02:37.638734 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638757 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638762 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 08:02:37.644928 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638766 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 08:02:37.645541 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638769 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 08:02:37.645541 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638772 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 08:02:37.645541 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638775 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 08:02:37.645541 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638778 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 08:02:37.645541 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638781 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 08:02:37.645541 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638784 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 08:02:37.645541 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638786 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 08:02:37.645541 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638789 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 08:02:37.645541 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638791 2580 feature_gate.go:328] unrecognized feature gate: 
ConsolePluginContentSecurityPolicy Apr 17 08:02:37.645541 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638794 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 08:02:37.645541 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638797 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 08:02:37.645541 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638799 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 08:02:37.645541 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638802 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 08:02:37.645541 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638806 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 08:02:37.645541 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638809 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 08:02:37.645541 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638812 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 08:02:37.645541 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638815 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 08:02:37.645541 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638818 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 08:02:37.645541 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638821 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 08:02:37.646076 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638823 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 08:02:37.646076 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638827 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 08:02:37.646076 ip-10-0-128-245 
kubenswrapper[2580]: W0417 08:02:37.638831 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 08:02:37.646076 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638834 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 08:02:37.646076 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638837 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 08:02:37.646076 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638842 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 08:02:37.646076 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638845 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 08:02:37.646076 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638848 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 08:02:37.646076 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638851 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 08:02:37.646076 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638853 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 08:02:37.646076 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638856 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 08:02:37.646076 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638859 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 08:02:37.646076 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638862 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 08:02:37.646076 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638865 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 08:02:37.646076 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638867 2580 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImagesAzure Apr 17 08:02:37.646076 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638870 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 08:02:37.646076 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638872 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 17 08:02:37.646076 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638875 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 08:02:37.646076 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638877 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 08:02:37.646076 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638880 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 08:02:37.646581 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638882 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 08:02:37.646581 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638885 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 08:02:37.646581 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638888 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 08:02:37.646581 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638890 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 08:02:37.646581 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638893 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 08:02:37.646581 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638895 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 08:02:37.646581 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638898 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 08:02:37.646581 ip-10-0-128-245 kubenswrapper[2580]: 
W0417 08:02:37.638901 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 08:02:37.646581 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638903 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 08:02:37.646581 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638906 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 08:02:37.646581 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638924 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 08:02:37.646581 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638927 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 08:02:37.646581 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638930 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 08:02:37.646581 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638933 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 08:02:37.646581 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638936 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 08:02:37.646581 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638939 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 08:02:37.646581 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638941 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 08:02:37.646581 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638945 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 08:02:37.646581 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638948 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 08:02:37.646581 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638951 2580 feature_gate.go:328] unrecognized feature gate: 
SetEIPForNLBIngressController Apr 17 08:02:37.647101 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638953 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 08:02:37.647101 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638956 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 08:02:37.647101 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638959 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 08:02:37.647101 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638962 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 08:02:37.647101 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638964 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 08:02:37.647101 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638967 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 08:02:37.647101 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638971 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 08:02:37.647101 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638974 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 08:02:37.647101 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638976 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 08:02:37.647101 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638979 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 08:02:37.647101 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638982 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 08:02:37.647101 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638984 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 08:02:37.647101 ip-10-0-128-245 kubenswrapper[2580]: 
W0417 08:02:37.638987 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 08:02:37.647101 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638989 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 08:02:37.647101 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638992 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 08:02:37.647101 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638994 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 08:02:37.647101 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.638997 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 08:02:37.647101 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.639000 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 08:02:37.647101 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.639002 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 08:02:37.647583 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.639005 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 08:02:37.647583 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.639007 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 08:02:37.647583 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.639010 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 08:02:37.647583 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.639962 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true 
StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 08:02:37.647583 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.646900 2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 08:02:37.647583 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.646946 2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 08:02:37.647583 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647001 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 08:02:37.647583 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647007 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 08:02:37.647583 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647011 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 08:02:37.647583 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647014 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 08:02:37.647583 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647017 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 08:02:37.647583 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647020 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 08:02:37.647583 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647024 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 08:02:37.647583 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647027 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 08:02:37.647583 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647030 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 08:02:37.647583 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647033 2580 feature_gate.go:328] 
unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 08:02:37.648062 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647035 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 08:02:37.648062 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647043 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 08:02:37.648062 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647046 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 08:02:37.648062 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647048 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 08:02:37.648062 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647051 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 08:02:37.648062 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647053 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 08:02:37.648062 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647056 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 08:02:37.648062 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647059 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 08:02:37.648062 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647072 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 08:02:37.648062 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647075 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 08:02:37.648062 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647077 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 08:02:37.648062 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647080 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 08:02:37.648062 ip-10-0-128-245 
kubenswrapper[2580]: W0417 08:02:37.647083 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 08:02:37.648062 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647086 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 08:02:37.648062 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647090 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 08:02:37.648062 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647095 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 08:02:37.648062 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647098 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 08:02:37.648062 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647101 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 08:02:37.648062 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647104 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 08:02:37.648062 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647107 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 08:02:37.648566 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647109 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 08:02:37.648566 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647120 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 08:02:37.648566 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647123 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 08:02:37.648566 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647126 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 08:02:37.648566 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647128 2580 
feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 08:02:37.648566 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647131 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 08:02:37.648566 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647133 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 08:02:37.648566 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647136 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 08:02:37.648566 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647139 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 08:02:37.648566 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647142 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 08:02:37.648566 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647144 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 08:02:37.648566 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647147 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 08:02:37.648566 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647149 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 08:02:37.648566 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647152 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 08:02:37.648566 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647154 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 08:02:37.648566 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647157 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 08:02:37.648566 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647160 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 
08:02:37.648566 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647162 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 08:02:37.648566 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647165 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 08:02:37.649060 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647167 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 08:02:37.649060 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647170 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 08:02:37.649060 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647172 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 08:02:37.649060 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647174 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 08:02:37.649060 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647177 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 17 08:02:37.649060 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647179 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 08:02:37.649060 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647182 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 08:02:37.649060 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647185 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 08:02:37.649060 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647188 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 08:02:37.649060 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647190 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 08:02:37.649060 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647192 2580 feature_gate.go:328] 
unrecognized feature gate: Example2 Apr 17 08:02:37.649060 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647195 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 08:02:37.649060 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647197 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 08:02:37.649060 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647200 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 08:02:37.649060 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647203 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 08:02:37.649060 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647212 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 08:02:37.649060 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647215 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 08:02:37.649060 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647217 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 08:02:37.649060 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647220 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 08:02:37.649060 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647223 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 08:02:37.649557 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647228 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 08:02:37.649557 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647231 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 08:02:37.649557 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647233 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 08:02:37.649557 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647236 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 08:02:37.649557 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647239 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 08:02:37.649557 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647241 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 08:02:37.649557 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647244 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 08:02:37.649557 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647247 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 08:02:37.649557 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647250 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 08:02:37.649557 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647252 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 08:02:37.649557 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647255 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 08:02:37.649557 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647257 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 08:02:37.649557 ip-10-0-128-245 kubenswrapper[2580]: W0417 
08:02:37.647259 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 08:02:37.649557 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647262 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 08:02:37.649557 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647264 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 08:02:37.649557 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647267 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 08:02:37.649557 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647269 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 08:02:37.650108 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.647275 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 08:02:37.650108 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647400 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 08:02:37.650108 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647406 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 08:02:37.650108 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647409 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 08:02:37.650108 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647412 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 08:02:37.650108 
ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647415 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 08:02:37.650108 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647418 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 08:02:37.650108 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647421 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 08:02:37.650108 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647423 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 08:02:37.650108 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647426 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 08:02:37.650108 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647429 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 08:02:37.650108 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647440 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 08:02:37.650108 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647443 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 08:02:37.650108 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647446 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 08:02:37.650108 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647448 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 08:02:37.650545 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647451 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 08:02:37.650545 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647453 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 17 08:02:37.650545 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647456 2580 feature_gate.go:328] unrecognized feature gate: 
MachineConfigNodes Apr 17 08:02:37.650545 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647458 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 08:02:37.650545 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647462 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 08:02:37.650545 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647467 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 08:02:37.650545 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647470 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 08:02:37.650545 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647473 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 08:02:37.650545 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647475 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 08:02:37.650545 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647478 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 08:02:37.650545 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647480 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 08:02:37.650545 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647483 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 08:02:37.650545 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647486 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 08:02:37.650545 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647488 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 08:02:37.650545 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647491 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 08:02:37.650545 ip-10-0-128-245 
kubenswrapper[2580]: W0417 08:02:37.647493 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 08:02:37.650545 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647496 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 08:02:37.650545 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647498 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 08:02:37.650545 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647501 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 08:02:37.650545 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647503 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 08:02:37.651053 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647507 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 08:02:37.651053 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647511 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 08:02:37.651053 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647514 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 08:02:37.651053 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647518 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 08:02:37.651053 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647521 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 08:02:37.651053 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647524 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 08:02:37.651053 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647527 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 08:02:37.651053 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647530 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 08:02:37.651053 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647533 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 08:02:37.651053 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647542 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 08:02:37.651053 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647545 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 08:02:37.651053 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647548 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 08:02:37.651053 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647550 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 08:02:37.651053 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647553 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 08:02:37.651053 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647555 2580 feature_gate.go:328] 
unrecognized feature gate: VSphereMultiDisk Apr 17 08:02:37.651053 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647558 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 08:02:37.651053 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647561 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 08:02:37.651053 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647564 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 08:02:37.651053 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647567 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 08:02:37.651053 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647569 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 08:02:37.651550 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647572 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 08:02:37.651550 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647574 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 08:02:37.651550 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647577 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 08:02:37.651550 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647579 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 08:02:37.651550 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647582 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 08:02:37.651550 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647584 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 08:02:37.651550 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647587 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 08:02:37.651550 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647590 2580 
feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 08:02:37.651550 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647592 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 08:02:37.651550 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647595 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 08:02:37.651550 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647598 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 08:02:37.651550 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647600 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 08:02:37.651550 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647603 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 08:02:37.651550 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647606 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 08:02:37.651550 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647608 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 08:02:37.651550 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647611 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 08:02:37.651550 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647614 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 08:02:37.651550 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647616 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 08:02:37.651550 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647619 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 08:02:37.651550 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647622 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 
17 08:02:37.652146 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647625 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 08:02:37.652146 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647628 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 08:02:37.652146 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647637 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 08:02:37.652146 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647639 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 08:02:37.652146 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647642 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 08:02:37.652146 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647645 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 08:02:37.652146 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647647 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 08:02:37.652146 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647650 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 08:02:37.652146 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647652 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 08:02:37.652146 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647655 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 08:02:37.652146 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647658 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 08:02:37.652146 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:37.647660 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 08:02:37.652146 ip-10-0-128-245 kubenswrapper[2580]: I0417 
08:02:37.647666 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 08:02:37.652146 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.648456 2580 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 08:02:37.652513 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.652359 2580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 08:02:37.653367 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.653352 2580 server.go:1019] "Starting client certificate rotation" Apr 17 08:02:37.653476 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.653455 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 08:02:37.654126 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.654111 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 08:02:37.683617 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.683578 2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 08:02:37.686115 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.686085 2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 08:02:37.701119 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.701083 2580 log.go:25] "Validated CRI v1 runtime API" Apr 17 08:02:37.707285 
ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.707259 2580 log.go:25] "Validated CRI v1 image API" Apr 17 08:02:37.708778 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.708754 2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 08:02:37.710528 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.710502 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 08:02:37.713880 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.713830 2580 fs.go:135] Filesystem UUIDs: map[117cb87a-398d-4593-a061-7e51b27a97da:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 9746c24b-9a9f-4761-9365-87c59f90f1f4:/dev/nvme0n1p4] Apr 17 08:02:37.713880 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.713866 2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 08:02:37.721031 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.720849 2580 manager.go:217] Machine: {Timestamp:2026-04-17 08:02:37.718517321 +0000 UTC m=+0.416473998 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100036 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec204f55d0893429cdfe5e2dc002c02f SystemUUID:ec204f55-d089-3429-cdfe-5e2dc002c02f BootID:3686a509-1412-422e-841d-3cc3a61f0bff 
Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:9e:11:72:69:1b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:9e:11:72:69:1b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:92:0d:df:1a:f6:44 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 08:02:37.721740 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.721727 2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 08:02:37.721919 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.721895 2580 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 08:02:37.725576 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.725532 2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 08:02:37.725751 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.725579 2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-245.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 08:02:37.725797 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.725765 2580 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 08:02:37.725797 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.725774 2580 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 08:02:37.725797 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.725788 2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 08:02:37.726697 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.726683 2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 08:02:37.728286 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.728272 2580 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 08:02:37.728445 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.728434 2580 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 08:02:37.731125 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.731111 2580 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 08:02:37.731181 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.731135 2580 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 08:02:37.731181 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.731149 2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 08:02:37.731181 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.731162 2580 kubelet.go:397] "Adding apiserver pod source"
Apr 17 08:02:37.731181 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.731173 2580 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 08:02:37.733706 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.733677 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 08:02:37.733706 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.733711 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 08:02:37.739710 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.739685 2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 08:02:37.741419 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.741400 2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 08:02:37.743309 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.743296 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 08:02:37.743386 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.743316 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 08:02:37.743386 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.743322 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 08:02:37.743386 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.743328 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 08:02:37.743386 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.743334 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 08:02:37.743386 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.743340 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 08:02:37.743386 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.743346 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 08:02:37.743386 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.743353 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 08:02:37.743386 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.743361 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 08:02:37.743386 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.743368 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 08:02:37.743386 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.743377 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 08:02:37.743386 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.743386 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 08:02:37.744147 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.744134 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 08:02:37.744183 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.744148 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 08:02:37.747394 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:37.747164 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 17 08:02:37.747535 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:37.747257 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-245.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 17 08:02:37.748212 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.748198 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 08:02:37.748258 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.748250 2580 server.go:1295] "Started kubelet"
Apr 17 08:02:37.748372 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.748342 2580 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 08:02:37.748447 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.748392 2580 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 08:02:37.748525 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.748505 2580 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 08:02:37.749334 ip-10-0-128-245 systemd[1]: Started Kubernetes Kubelet.
Apr 17 08:02:37.750211 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.750075 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 08:02:37.751702 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.751680 2580 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 08:02:37.754697 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.754672 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7hpfd"
Apr 17 08:02:37.755855 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.755833 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 08:02:37.756659 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.756643 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 08:02:37.757032 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.757009 2580 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-245.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 17 08:02:37.757580 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.757562 2580 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 08:02:37.757580 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.757581 2580 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 08:02:37.757691 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.757682 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 08:02:37.757754 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.757745 2580 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 08:02:37.757784 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.757756 2580 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 08:02:37.757816 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.757791 2580 factory.go:153] Registering CRI-O factory
Apr 17 08:02:37.757883 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:37.757863 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-245.ec2.internal\" not found"
Apr 17 08:02:37.757883 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.757879 2580 factory.go:223] Registration of the crio container factory successfully
Apr 17 08:02:37.758252 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.758216 2580 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 08:02:37.758252 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.758229 2580 factory.go:55] Registering systemd factory
Apr 17 08:02:37.758252 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.758237 2580 factory.go:223] Registration of the systemd container factory successfully
Apr 17 08:02:37.758409 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.758260 2580 factory.go:103] Registering Raw factory
Apr 17 08:02:37.758409 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.758271 2580 manager.go:1196] Started watching for new ooms in manager
Apr 17 08:02:37.758611 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:37.757134 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-245.ec2.internal.18a7162c58fbb1f3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-245.ec2.internal,UID:ip-10-0-128-245.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-245.ec2.internal,},FirstTimestamp:2026-04-17 08:02:37.748212211 +0000 UTC m=+0.446168868,LastTimestamp:2026-04-17 08:02:37.748212211 +0000 UTC m=+0.446168868,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-245.ec2.internal,}"
Apr 17 08:02:37.758685 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.758676 2580 manager.go:319] Starting recovery of all containers
Apr 17 08:02:37.759338 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:37.759317 2580 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 08:02:37.761977 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.761950 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7hpfd"
Apr 17 08:02:37.764211 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:37.764183 2580 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-245.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 17 08:02:37.764351 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:37.764232 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 17 08:02:37.771152 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.770979 2580 manager.go:324] Recovery completed
Apr 17 08:02:37.775456 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.775434 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 08:02:37.777898 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.777878 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-245.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 08:02:37.777998 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.777925 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-245.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 08:02:37.777998 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.777938 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-245.ec2.internal" event="NodeHasSufficientPID"
Apr 17 08:02:37.778499 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.778486 2580 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 08:02:37.778562 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.778499 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 08:02:37.778562 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.778550 2580 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 08:02:37.781930 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.781890 2580 policy_none.go:49] "None policy: Start"
Apr 17 08:02:37.781930 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.781932 2580 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 08:02:37.782086 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.781947 2580 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 08:02:37.824710 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.824686 2580 manager.go:341] "Starting Device Plugin manager"
Apr 17 08:02:37.847069 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:37.824846 2580 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 08:02:37.847069 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.824865 2580 server.go:85] "Starting device plugin registration server"
Apr 17 08:02:37.847069 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.825191 2580 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 08:02:37.847069 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.825204 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 08:02:37.847069 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.825307 2580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 08:02:37.847069 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.825407 2580 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 08:02:37.847069 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.825416 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 08:02:37.847069 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:37.826133 2580 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 08:02:37.847069 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:37.826170 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-245.ec2.internal\" not found"
Apr 17 08:02:37.863978 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.863938 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 08:02:37.865247 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.865220 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 08:02:37.865359 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.865253 2580 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 08:02:37.865359 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.865274 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 08:02:37.865359 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.865280 2580 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 08:02:37.865359 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:37.865313 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 08:02:37.867890 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.867867 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 08:02:37.926154 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.926060 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 08:02:37.927380 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.927360 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-245.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 08:02:37.927452 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.927400 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-245.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 08:02:37.927452 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.927411 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-245.ec2.internal" event="NodeHasSufficientPID"
Apr 17 08:02:37.927452 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.927436 2580 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-245.ec2.internal"
Apr 17 08:02:37.937101 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.937074 2580 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-245.ec2.internal"
Apr 17 08:02:37.937169 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:37.937107 2580 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-245.ec2.internal\": node \"ip-10-0-128-245.ec2.internal\" not found"
Apr 17 08:02:37.951179 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:37.951153 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-245.ec2.internal\" not found"
Apr 17 08:02:37.965630 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.965589 2580 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-245.ec2.internal"]
Apr 17 08:02:37.965684 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.965676 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 08:02:37.966710 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.966687 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-245.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 08:02:37.966825 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.966720 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-245.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 08:02:37.966825 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.966735 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-245.ec2.internal" event="NodeHasSufficientPID"
Apr 17 08:02:37.969251 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.969227 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 08:02:37.969389 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.969371 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal"
Apr 17 08:02:37.969435 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.969411 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 08:02:37.970302 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.970285 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-245.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 08:02:37.970302 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.970298 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-245.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 08:02:37.970424 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.970318 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-245.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 08:02:37.970424 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.970321 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-245.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 08:02:37.970424 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.970328 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-245.ec2.internal" event="NodeHasSufficientPID"
Apr 17 08:02:37.970424 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.970333 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-245.ec2.internal" event="NodeHasSufficientPID"
Apr 17 08:02:37.972699 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.972676 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-245.ec2.internal"
Apr 17 08:02:37.972783 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.972722 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 08:02:37.973695 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.973675 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-245.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 08:02:37.973783 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.973712 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-245.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 08:02:37.973783 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:37.973722 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-245.ec2.internal" event="NodeHasSufficientPID"
Apr 17 08:02:37.998803 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:37.998778 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-245.ec2.internal\" not found" node="ip-10-0-128-245.ec2.internal"
Apr 17 08:02:38.003549 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:38.003527 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-245.ec2.internal\" not found" node="ip-10-0-128-245.ec2.internal"
Apr 17 08:02:38.051281 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:38.051231 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-245.ec2.internal\" not found"
Apr 17 08:02:38.059649 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.059612 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bd86dab92626f6c7dc59a5d2e42a1f67-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal\" (UID: \"bd86dab92626f6c7dc59a5d2e42a1f67\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal"
Apr 17 08:02:38.059721 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.059663 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd86dab92626f6c7dc59a5d2e42a1f67-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal\" (UID: \"bd86dab92626f6c7dc59a5d2e42a1f67\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal"
Apr 17 08:02:38.059721 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.059684 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d0c56bbacc967c418e4c91e68e0ba0d3-config\") pod \"kube-apiserver-proxy-ip-10-0-128-245.ec2.internal\" (UID: \"d0c56bbacc967c418e4c91e68e0ba0d3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-245.ec2.internal"
Apr 17 08:02:38.151811 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:38.151774 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-245.ec2.internal\" not found"
Apr 17 08:02:38.160157 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.160133 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bd86dab92626f6c7dc59a5d2e42a1f67-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal\" (UID: \"bd86dab92626f6c7dc59a5d2e42a1f67\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal"
Apr 17 08:02:38.160218 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.160167 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd86dab92626f6c7dc59a5d2e42a1f67-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal\" (UID: \"bd86dab92626f6c7dc59a5d2e42a1f67\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal"
Apr 17 08:02:38.160218 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.160185 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d0c56bbacc967c418e4c91e68e0ba0d3-config\") pod \"kube-apiserver-proxy-ip-10-0-128-245.ec2.internal\" (UID: \"d0c56bbacc967c418e4c91e68e0ba0d3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-245.ec2.internal"
Apr 17 08:02:38.160283 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.160232 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d0c56bbacc967c418e4c91e68e0ba0d3-config\") pod \"kube-apiserver-proxy-ip-10-0-128-245.ec2.internal\" (UID: \"d0c56bbacc967c418e4c91e68e0ba0d3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-245.ec2.internal"
Apr 17 08:02:38.160283 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.160237 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bd86dab92626f6c7dc59a5d2e42a1f67-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal\" (UID: \"bd86dab92626f6c7dc59a5d2e42a1f67\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal"
Apr 17 08:02:38.160283 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.160240 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd86dab92626f6c7dc59a5d2e42a1f67-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal\" (UID: \"bd86dab92626f6c7dc59a5d2e42a1f67\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal"
Apr 17 08:02:38.252674 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:38.252600 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-245.ec2.internal\" not found"
Apr 17 08:02:38.301123 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.301095 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal"
Apr 17 08:02:38.306607 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.306581 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-245.ec2.internal"
Apr 17 08:02:38.353705 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:38.353644 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-245.ec2.internal\" not found"
Apr 17 08:02:38.454200 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:38.454168 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-245.ec2.internal\" not found"
Apr 17 08:02:38.554892 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:38.554801 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-245.ec2.internal\" not found"
Apr 17 08:02:38.653438 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.653391 2580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 08:02:38.654124 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.653570 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 08:02:38.655581 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:38.655556 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-245.ec2.internal\" not found"
Apr 17 08:02:38.702961 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.702925 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 08:02:38.755891 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:38.755854 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-245.ec2.internal\" not found"
Apr 17 08:02:38.756051 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.755956 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 08:02:38.764824 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.764787 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 07:57:37 +0000 UTC" deadline="2028-01-21 02:05:13.154635848 +0000 UTC"
Apr 17 08:02:38.764824 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.764822 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15450h2m34.389817651s"
Apr 17 08:02:38.767126 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.767095 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 08:02:38.788108 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.788079 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-jmkc2"
Apr 17 08:02:38.793363 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.793330 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving"
csr="csr-jmkc2" Apr 17 08:02:38.856540 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:38.856457 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-245.ec2.internal\" not found" Apr 17 08:02:38.913762 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:38.913706 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd86dab92626f6c7dc59a5d2e42a1f67.slice/crio-85dec64e67ba1deaa90eeefe12e5cadf7dcaa2df5352652e1ffaaa716704a525 WatchSource:0}: Error finding container 85dec64e67ba1deaa90eeefe12e5cadf7dcaa2df5352652e1ffaaa716704a525: Status 404 returned error can't find the container with id 85dec64e67ba1deaa90eeefe12e5cadf7dcaa2df5352652e1ffaaa716704a525 Apr 17 08:02:38.914099 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:38.914082 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0c56bbacc967c418e4c91e68e0ba0d3.slice/crio-b3da5f6a89cf7e5c43e74fa096600ea9fc3530d680eaa8e28b36b78249a63a80 WatchSource:0}: Error finding container b3da5f6a89cf7e5c43e74fa096600ea9fc3530d680eaa8e28b36b78249a63a80: Status 404 returned error can't find the container with id b3da5f6a89cf7e5c43e74fa096600ea9fc3530d680eaa8e28b36b78249a63a80 Apr 17 08:02:38.918212 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.918189 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:02:38.918508 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.918490 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 08:02:38.955202 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.955167 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 08:02:38.956900 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.956880 2580 
kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-245.ec2.internal" Apr 17 08:02:38.967413 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.967381 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 08:02:38.968353 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.968337 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal" Apr 17 08:02:38.981245 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:38.981218 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 08:02:39.663804 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.663771 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 08:02:39.733325 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.733292 2580 apiserver.go:52] "Watching apiserver" Apr 17 08:02:39.738720 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.738690 2580 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 08:02:39.741237 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.741199 2580 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-node-tuning-operator/tuned-clh84","openshift-multus/multus-additional-cni-plugins-q7p6p","openshift-multus/multus-r995s","openshift-ovn-kubernetes/ovnkube-node-wwcgr","kube-system/konnectivity-agent-ttxzs","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg","openshift-dns/node-resolver-sv4zj","openshift-image-registry/node-ca-lf6p6","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal","openshift-multus/network-metrics-daemon-r9td5","openshift-network-diagnostics/network-check-target-bhnbj","openshift-network-operator/iptables-alerter-pr74b","kube-system/kube-apiserver-proxy-ip-10-0-128-245.ec2.internal"] Apr 17 08:02:39.743812 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.743784 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5" Apr 17 08:02:39.743992 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:39.743876 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9td5" podUID="5c94e060-29ca-49bd-9d62-210b4628adef" Apr 17 08:02:39.746115 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.746082 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q7p6p" Apr 17 08:02:39.748106 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.748023 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 08:02:39.748242 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.748124 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 08:02:39.748242 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.748137 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 08:02:39.748702 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.748439 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-d6vrr\"" Apr 17 08:02:39.748702 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.748607 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 08:02:39.748702 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.748646 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.748925 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.748648 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 08:02:39.750826 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.750661 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 08:02:39.750826 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.750661 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 08:02:39.750826 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.750720 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2wrzj\"" Apr 17 08:02:39.751073 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.750952 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 08:02:39.751073 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.750971 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 08:02:39.751073 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.751006 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.751215 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.751162 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 08:02:39.751264 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.751162 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 08:02:39.753225 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.753199 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ttxzs" Apr 17 08:02:39.754025 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.754001 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 08:02:39.754349 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.754327 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-bqp68\"" Apr 17 08:02:39.754447 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.754409 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 08:02:39.757336 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.755670 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 08:02:39.757336 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.755934 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 08:02:39.757336 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.756516 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kube-system\"/\"default-dockercfg-ljsjb\"" Apr 17 08:02:39.760740 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.760703 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg" Apr 17 08:02:39.762611 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.762586 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 08:02:39.762757 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.762661 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-p566c\"" Apr 17 08:02:39.762757 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.762675 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 08:02:39.762865 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.762845 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 08:02:39.763355 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.763335 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-sv4zj" Apr 17 08:02:39.765087 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.765067 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 08:02:39.765296 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.765274 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 08:02:39.765381 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.765160 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-lx6b2\"" Apr 17 08:02:39.765516 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.765498 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-lf6p6" Apr 17 08:02:39.765663 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.765646 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-r995s" Apr 17 08:02:39.767238 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.767051 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 08:02:39.767364 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.767293 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 08:02:39.767803 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.767560 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-lv2z6\"" Apr 17 08:02:39.767803 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.767612 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 08:02:39.767803 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.767646 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 08:02:39.767803 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.767762 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-xm7h6\"" Apr 17 08:02:39.768099 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.768049 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhnbj" Apr 17 08:02:39.768152 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:39.768130 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bhnbj" podUID="2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2" Apr 17 08:02:39.770109 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770085 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-run-systemd\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.770214 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770124 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-var-lib-openvswitch\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.770214 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770153 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-run-openvswitch\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.770214 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770180 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-run-ovn-kubernetes\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.770214 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770203 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-cni-bin\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.770417 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770228 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvtxv\" (UniqueName: \"kubernetes.io/projected/02718710-e78f-45e5-97ee-f802acc6c063-kube-api-access-zvtxv\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.770417 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770253 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-kubelet\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.770417 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770276 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-etc-openvswitch\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.770417 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770301 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02718710-e78f-45e5-97ee-f802acc6c063-env-overrides\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.770417 ip-10-0-128-245 
kubenswrapper[2580]: I0417 08:02:39.770326 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-systemd\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.770417 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770349 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-lib-modules\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.770417 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770372 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs\") pod \"network-metrics-daemon-r9td5\" (UID: \"5c94e060-29ca-49bd-9d62-210b4628adef\") " pod="openshift-multus/network-metrics-daemon-r9td5" Apr 17 08:02:39.770417 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770386 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-pr74b" Apr 17 08:02:39.770417 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770396 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18d4abe2-95b8-4158-acde-3d01b4526f60-os-release\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p" Apr 17 08:02:39.770836 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770426 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18d4abe2-95b8-4158-acde-3d01b4526f60-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p" Apr 17 08:02:39.770836 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770460 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-run-ovn\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.770836 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770486 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-log-socket\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.770836 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770513 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-cni-netd\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.770836 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770538 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-modprobe-d\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.770836 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770562 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-sysconfig\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.770836 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770585 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-run\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.770836 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770616 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/36159bce-5e37-46a6-b216-a4bc0a7e38a8-agent-certs\") pod \"konnectivity-agent-ttxzs\" (UID: \"36159bce-5e37-46a6-b216-a4bc0a7e38a8\") " pod="kube-system/konnectivity-agent-ttxzs" Apr 17 08:02:39.770836 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770641 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/36159bce-5e37-46a6-b216-a4bc0a7e38a8-konnectivity-ca\") pod \"konnectivity-agent-ttxzs\" (UID: \"36159bce-5e37-46a6-b216-a4bc0a7e38a8\") " pod="kube-system/konnectivity-agent-ttxzs" Apr 17 08:02:39.770836 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770664 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-run-netns\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.770836 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770694 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-var-lib-kubelet\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.770836 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770717 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-tuned\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.770836 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770740 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85251bca-2387-47f1-892a-cf015be5673d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg" Apr 17 08:02:39.770836 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770764 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/85251bca-2387-47f1-892a-cf015be5673d-device-dir\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg" Apr 17 08:02:39.770836 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770795 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/18d4abe2-95b8-4158-acde-3d01b4526f60-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p" Apr 17 08:02:39.770836 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770818 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqt6p\" (UniqueName: \"kubernetes.io/projected/18d4abe2-95b8-4158-acde-3d01b4526f60-kube-api-access-vqt6p\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p" Apr 17 08:02:39.771593 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770840 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02718710-e78f-45e5-97ee-f802acc6c063-ovnkube-script-lib\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.771593 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770862 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-kubernetes\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.771593 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770885 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-sysctl-conf\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.771593 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770952 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-host\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.771593 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.770977 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/85251bca-2387-47f1-892a-cf015be5673d-socket-dir\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg"
Apr 17 08:02:39.771593 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.771001 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/85251bca-2387-47f1-892a-cf015be5673d-etc-selinux\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg"
Apr 17 08:02:39.771593 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.771024 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/85251bca-2387-47f1-892a-cf015be5673d-sys-fs\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg"
Apr 17 08:02:39.771593 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.771060 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpf8h\" (UniqueName: \"kubernetes.io/projected/85251bca-2387-47f1-892a-cf015be5673d-kube-api-access-jpf8h\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg"
Apr 17 08:02:39.771593 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.771090 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-node-log\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.771593 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.771113 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02718710-e78f-45e5-97ee-f802acc6c063-ovnkube-config\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.771593 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.771136 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-sys\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.771593 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.771230 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/73847dfb-9da7-48a8-9c86-58744827d1a8-tmp\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.771593 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.771286 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/85251bca-2387-47f1-892a-cf015be5673d-registration-dir\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg"
Apr 17 08:02:39.771593 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.771365 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4h9j\" (UniqueName: \"kubernetes.io/projected/5c94e060-29ca-49bd-9d62-210b4628adef-kube-api-access-g4h9j\") pod \"network-metrics-daemon-r9td5\" (UID: \"5c94e060-29ca-49bd-9d62-210b4628adef\") " pod="openshift-multus/network-metrics-daemon-r9td5"
Apr 17 08:02:39.771593 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.771398 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18d4abe2-95b8-4158-acde-3d01b4526f60-cnibin\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p"
Apr 17 08:02:39.771593 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.771473 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18d4abe2-95b8-4158-acde-3d01b4526f60-cni-binary-copy\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p"
Apr 17 08:02:39.772350 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.771530 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-systemd-units\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.772350 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.771572 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02718710-e78f-45e5-97ee-f802acc6c063-ovn-node-metrics-cert\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.772350 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.771619 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-sysctl-d\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.772350 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.771677 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18d4abe2-95b8-4158-acde-3d01b4526f60-system-cni-dir\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p"
Apr 17 08:02:39.772350 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.771716 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/18d4abe2-95b8-4158-acde-3d01b4526f60-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p"
Apr 17 08:02:39.772350 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.771756 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-slash\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.772350 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.771782 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.772350 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.771817 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtl9x\" (UniqueName: \"kubernetes.io/projected/73847dfb-9da7-48a8-9c86-58744827d1a8-kube-api-access-rtl9x\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.772350 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.772351 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 08:02:39.772774 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.772359 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6lrwt\""
Apr 17 08:02:39.772774 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.772662 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 08:02:39.772774 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.772666 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 08:02:39.794088 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.794051 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:57:38 +0000 UTC" deadline="2027-12-05 08:29:12.110581239 +0000 UTC"
Apr 17 08:02:39.794088 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.794086 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14328h26m32.316499512s"
Apr 17 08:02:39.859542 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.859513 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 08:02:39.870146 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.870072 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal" event={"ID":"bd86dab92626f6c7dc59a5d2e42a1f67","Type":"ContainerStarted","Data":"85dec64e67ba1deaa90eeefe12e5cadf7dcaa2df5352652e1ffaaa716704a525"}
Apr 17 08:02:39.871619 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.871587 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-245.ec2.internal" event={"ID":"d0c56bbacc967c418e4c91e68e0ba0d3","Type":"ContainerStarted","Data":"b3da5f6a89cf7e5c43e74fa096600ea9fc3530d680eaa8e28b36b78249a63a80"}
Apr 17 08:02:39.872027 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872003 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-cni-binary-copy\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.872126 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872041 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-host-run-netns\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.872126 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872064 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02718710-e78f-45e5-97ee-f802acc6c063-ovn-node-metrics-cert\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.872215 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872198 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-sysctl-d\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.872392 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872369 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-sysctl-d\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.872392 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872378 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-host-var-lib-kubelet\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.872553 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872413 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-hostroot\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.872553 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872431 2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 08:02:39.872553 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872452 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.872553 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872513 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvtxv\" (UniqueName: \"kubernetes.io/projected/02718710-e78f-45e5-97ee-f802acc6c063-kube-api-access-zvtxv\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.872553 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872543 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-systemd\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.872795 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872567 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-lib-modules\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.872795 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872596 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18d4abe2-95b8-4158-acde-3d01b4526f60-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p"
Apr 17 08:02:39.872795 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872624 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-host-var-lib-cni-multus\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.872795 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872651 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e95c47f8-d745-42a4-8a4b-3f83ef6805b8-host-slash\") pod \"iptables-alerter-pr74b\" (UID: \"e95c47f8-d745-42a4-8a4b-3f83ef6805b8\") " pod="openshift-network-operator/iptables-alerter-pr74b"
Apr 17 08:02:39.872795 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872677 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-kubelet\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.872795 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872700 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02718710-e78f-45e5-97ee-f802acc6c063-env-overrides\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.872795 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872724 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18d4abe2-95b8-4158-acde-3d01b4526f60-os-release\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p"
Apr 17 08:02:39.872795 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872751 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/62832cab-fee8-498a-b9dd-d410e9f3e921-tmp-dir\") pod \"node-resolver-sv4zj\" (UID: \"62832cab-fee8-498a-b9dd-d410e9f3e921\") " pod="openshift-dns/node-resolver-sv4zj"
Apr 17 08:02:39.872795 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872778 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l49n\" (UniqueName: \"kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n\") pod \"network-check-target-bhnbj\" (UID: \"2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2\") " pod="openshift-network-diagnostics/network-check-target-bhnbj"
Apr 17 08:02:39.874526 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872806 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-log-socket\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.874526 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872826 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-kubelet\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.874526 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872844 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-modprobe-d\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.874526 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872845 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18d4abe2-95b8-4158-acde-3d01b4526f60-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p"
Apr 17 08:02:39.874526 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872513 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.874526 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872870 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-sysconfig\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.874526 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872946 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-run\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.874526 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.872971 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/36159bce-5e37-46a6-b216-a4bc0a7e38a8-agent-certs\") pod \"konnectivity-agent-ttxzs\" (UID: \"36159bce-5e37-46a6-b216-a4bc0a7e38a8\") " pod="kube-system/konnectivity-agent-ttxzs"
Apr 17 08:02:39.874526 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873002 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18d4abe2-95b8-4158-acde-3d01b4526f60-os-release\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p"
Apr 17 08:02:39.874526 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873016 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-cnibin\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.874526 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873046 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-host-var-lib-cni-bin\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.874526 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873113 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-run-netns\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.874526 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873156 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/85251bca-2387-47f1-892a-cf015be5673d-device-dir\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg"
Apr 17 08:02:39.874526 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873182 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/18d4abe2-95b8-4158-acde-3d01b4526f60-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p"
Apr 17 08:02:39.874526 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873226 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/62832cab-fee8-498a-b9dd-d410e9f3e921-hosts-file\") pod \"node-resolver-sv4zj\" (UID: \"62832cab-fee8-498a-b9dd-d410e9f3e921\") " pod="openshift-dns/node-resolver-sv4zj"
Apr 17 08:02:39.874526 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873283 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d06f7085-abd8-4770-bc98-8794c5f1a056-serviceca\") pod \"node-ca-lf6p6\" (UID: \"d06f7085-abd8-4770-bc98-8794c5f1a056\") " pod="openshift-image-registry/node-ca-lf6p6"
Apr 17 08:02:39.874526 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873306 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-etc-kubernetes\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.875431 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873329 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-host\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.875431 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873360 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/85251bca-2387-47f1-892a-cf015be5673d-etc-selinux\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg"
Apr 17 08:02:39.875431 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873399 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5tkv\" (UniqueName: \"kubernetes.io/projected/62832cab-fee8-498a-b9dd-d410e9f3e921-kube-api-access-m5tkv\") pod \"node-resolver-sv4zj\" (UID: \"62832cab-fee8-498a-b9dd-d410e9f3e921\") " pod="openshift-dns/node-resolver-sv4zj"
Apr 17 08:02:39.875431 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873414 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02718710-e78f-45e5-97ee-f802acc6c063-env-overrides\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.875431 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873426 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02718710-e78f-45e5-97ee-f802acc6c063-ovnkube-config\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.875431 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873501 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-host\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.875431 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873517 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-sysconfig\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.875431 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873572 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-run-netns\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.875431 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873590 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-lib-modules\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.875431 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873591 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/85251bca-2387-47f1-892a-cf015be5673d-etc-selinux\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg"
Apr 17 08:02:39.875431 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873607 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-modprobe-d\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.875431 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873610 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-log-socket\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.875431 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873636 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-run\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.875431 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873653 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/85251bca-2387-47f1-892a-cf015be5673d-device-dir\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg"
Apr 17 08:02:39.875431 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873729 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/18d4abe2-95b8-4158-acde-3d01b4526f60-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p"
Apr 17 08:02:39.875431 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873769 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flzxv\" (UniqueName: \"kubernetes.io/projected/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-kube-api-access-flzxv\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.875431 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873792 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-systemd\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.878481 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873937 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh5qx\" (UniqueName: \"kubernetes.io/projected/e95c47f8-d745-42a4-8a4b-3f83ef6805b8-kube-api-access-qh5qx\") pod \"iptables-alerter-pr74b\" (UID: \"e95c47f8-d745-42a4-8a4b-3f83ef6805b8\") " pod="openshift-network-operator/iptables-alerter-pr74b"
Apr 17 08:02:39.878481 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.873961 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/18d4abe2-95b8-4158-acde-3d01b4526f60-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p"
Apr 17 08:02:39.878481 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874002 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-systemd-units\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.878481 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874032 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18d4abe2-95b8-4158-acde-3d01b4526f60-system-cni-dir\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p"
Apr 17 08:02:39.878481 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874060 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-multus-socket-dir-parent\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.878481 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874088 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-slash\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.878481 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874100 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-systemd-units\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.878481 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874111 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtl9x\" (UniqueName: \"kubernetes.io/projected/73847dfb-9da7-48a8-9c86-58744827d1a8-kube-api-access-rtl9x\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:39.878481 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874136 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-host-run-multus-certs\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.878481 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874153 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18d4abe2-95b8-4158-acde-3d01b4526f60-system-cni-dir\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p"
Apr 17 08:02:39.878481 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874158 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e95c47f8-d745-42a4-8a4b-3f83ef6805b8-iptables-alerter-script\") pod \"iptables-alerter-pr74b\" (UID: \"e95c47f8-d745-42a4-8a4b-3f83ef6805b8\") " pod="openshift-network-operator/iptables-alerter-pr74b"
Apr 17 08:02:39.878481 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874162 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02718710-e78f-45e5-97ee-f802acc6c063-ovnkube-config\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.878481 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874184 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-run-systemd\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.878481 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874197 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/18d4abe2-95b8-4158-acde-3d01b4526f60-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p"
Apr 17 08:02:39.878481 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874208 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-var-lib-openvswitch\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.878481 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874233 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-run-openvswitch\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:39.878481 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874244 2580 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-run-systemd\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.879317 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874244 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-slash\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.879317 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874257 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-run-ovn-kubernetes\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.879317 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874281 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-var-lib-openvswitch\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.879317 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874289 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-run-openvswitch\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.879317 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874282 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-cni-bin\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.879317 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874320 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-run-ovn-kubernetes\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.879317 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874326 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-cni-bin\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.879317 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874320 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs\") pod \"network-metrics-daemon-r9td5\" (UID: \"5c94e060-29ca-49bd-9d62-210b4628adef\") " pod="openshift-multus/network-metrics-daemon-r9td5" Apr 17 08:02:39.879317 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874365 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqt6p\" (UniqueName: \"kubernetes.io/projected/18d4abe2-95b8-4158-acde-3d01b4526f60-kube-api-access-vqt6p\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p" Apr 17 08:02:39.879317 ip-10-0-128-245 
kubenswrapper[2580]: E0417 08:02:39.874401 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:39.879317 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874435 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-etc-openvswitch\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.879317 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874400 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-etc-openvswitch\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.879317 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:39.874466 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs podName:5c94e060-29ca-49bd-9d62-210b4628adef nodeName:}" failed. No retries permitted until 2026-04-17 08:02:40.37444473 +0000 UTC m=+3.072401374 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs") pod "network-metrics-daemon-r9td5" (UID: "5c94e060-29ca-49bd-9d62-210b4628adef") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:39.879317 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874556 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-system-cni-dir\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s" Apr 17 08:02:39.879317 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874586 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-multus-conf-dir\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s" Apr 17 08:02:39.879317 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874615 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-run-ovn\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.879317 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874648 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-cni-netd\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.880165 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874676 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/36159bce-5e37-46a6-b216-a4bc0a7e38a8-konnectivity-ca\") pod \"konnectivity-agent-ttxzs\" (UID: \"36159bce-5e37-46a6-b216-a4bc0a7e38a8\") " pod="kube-system/konnectivity-agent-ttxzs" Apr 17 08:02:39.880165 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874703 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxpnq\" (UniqueName: \"kubernetes.io/projected/d06f7085-abd8-4770-bc98-8794c5f1a056-kube-api-access-fxpnq\") pod \"node-ca-lf6p6\" (UID: \"d06f7085-abd8-4770-bc98-8794c5f1a056\") " pod="openshift-image-registry/node-ca-lf6p6" Apr 17 08:02:39.880165 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874720 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-host-cni-netd\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.880165 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874728 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-multus-cni-dir\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s" Apr 17 08:02:39.880165 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874677 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-run-ovn\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.880165 ip-10-0-128-245 kubenswrapper[2580]: I0417 
08:02:39.874752 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-host-run-k8s-cni-cncf-io\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s" Apr 17 08:02:39.880165 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874776 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-multus-daemon-config\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s" Apr 17 08:02:39.880165 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874801 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-var-lib-kubelet\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.880165 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874824 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-tuned\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.880165 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874848 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85251bca-2387-47f1-892a-cf015be5673d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg" Apr 17 
08:02:39.880165 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874873 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d06f7085-abd8-4770-bc98-8794c5f1a056-host\") pod \"node-ca-lf6p6\" (UID: \"d06f7085-abd8-4770-bc98-8794c5f1a056\") " pod="openshift-image-registry/node-ca-lf6p6" Apr 17 08:02:39.880165 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874889 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-var-lib-kubelet\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.880165 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874895 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02718710-e78f-45e5-97ee-f802acc6c063-ovnkube-script-lib\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.880165 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874941 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-kubernetes\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.880165 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874968 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-sysctl-conf\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " 
pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.880165 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.874990 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/85251bca-2387-47f1-892a-cf015be5673d-socket-dir\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg" Apr 17 08:02:39.880165 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875013 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/85251bca-2387-47f1-892a-cf015be5673d-sys-fs\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg" Apr 17 08:02:39.880940 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875038 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jpf8h\" (UniqueName: \"kubernetes.io/projected/85251bca-2387-47f1-892a-cf015be5673d-kube-api-access-jpf8h\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg" Apr 17 08:02:39.880940 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875063 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18d4abe2-95b8-4158-acde-3d01b4526f60-cnibin\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p" Apr 17 08:02:39.880940 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875089 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/18d4abe2-95b8-4158-acde-3d01b4526f60-cni-binary-copy\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p" Apr 17 08:02:39.880940 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875117 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-node-log\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.880940 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875142 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-sys\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.880940 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875167 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/73847dfb-9da7-48a8-9c86-58744827d1a8-tmp\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.880940 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875190 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/36159bce-5e37-46a6-b216-a4bc0a7e38a8-konnectivity-ca\") pod \"konnectivity-agent-ttxzs\" (UID: \"36159bce-5e37-46a6-b216-a4bc0a7e38a8\") " pod="kube-system/konnectivity-agent-ttxzs" Apr 17 08:02:39.880940 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875192 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/85251bca-2387-47f1-892a-cf015be5673d-registration-dir\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg" Apr 17 08:02:39.880940 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875232 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4h9j\" (UniqueName: \"kubernetes.io/projected/5c94e060-29ca-49bd-9d62-210b4628adef-kube-api-access-g4h9j\") pod \"network-metrics-daemon-r9td5\" (UID: \"5c94e060-29ca-49bd-9d62-210b4628adef\") " pod="openshift-multus/network-metrics-daemon-r9td5" Apr 17 08:02:39.880940 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875242 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/85251bca-2387-47f1-892a-cf015be5673d-registration-dir\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg" Apr 17 08:02:39.880940 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875257 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-os-release\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s" Apr 17 08:02:39.880940 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875321 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02718710-e78f-45e5-97ee-f802acc6c063-node-log\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.880940 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875371 2580 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-sys\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.880940 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875414 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02718710-e78f-45e5-97ee-f802acc6c063-ovnkube-script-lib\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.880940 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875546 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85251bca-2387-47f1-892a-cf015be5673d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg" Apr 17 08:02:39.880940 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875578 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/85251bca-2387-47f1-892a-cf015be5673d-sys-fs\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg" Apr 17 08:02:39.880940 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875646 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-sysctl-conf\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.882329 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875648 2580 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/85251bca-2387-47f1-892a-cf015be5673d-socket-dir\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg" Apr 17 08:02:39.882329 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875693 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18d4abe2-95b8-4158-acde-3d01b4526f60-cnibin\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p" Apr 17 08:02:39.882329 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.875734 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-kubernetes\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.882329 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.876188 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18d4abe2-95b8-4158-acde-3d01b4526f60-cni-binary-copy\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p" Apr 17 08:02:39.882329 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.878093 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/73847dfb-9da7-48a8-9c86-58744827d1a8-etc-tuned\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.882329 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.878186 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02718710-e78f-45e5-97ee-f802acc6c063-ovn-node-metrics-cert\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.882329 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.878401 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/36159bce-5e37-46a6-b216-a4bc0a7e38a8-agent-certs\") pod \"konnectivity-agent-ttxzs\" (UID: \"36159bce-5e37-46a6-b216-a4bc0a7e38a8\") " pod="kube-system/konnectivity-agent-ttxzs" Apr 17 08:02:39.882329 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.880556 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/73847dfb-9da7-48a8-9c86-58744827d1a8-tmp\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.882329 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.880614 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvtxv\" (UniqueName: \"kubernetes.io/projected/02718710-e78f-45e5-97ee-f802acc6c063-kube-api-access-zvtxv\") pod \"ovnkube-node-wwcgr\" (UID: \"02718710-e78f-45e5-97ee-f802acc6c063\") " pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:02:39.884117 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.884095 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtl9x\" (UniqueName: \"kubernetes.io/projected/73847dfb-9da7-48a8-9c86-58744827d1a8-kube-api-access-rtl9x\") pod \"tuned-clh84\" (UID: \"73847dfb-9da7-48a8-9c86-58744827d1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-clh84" Apr 17 08:02:39.887998 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.887969 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqt6p\" (UniqueName: \"kubernetes.io/projected/18d4abe2-95b8-4158-acde-3d01b4526f60-kube-api-access-vqt6p\") pod \"multus-additional-cni-plugins-q7p6p\" (UID: \"18d4abe2-95b8-4158-acde-3d01b4526f60\") " pod="openshift-multus/multus-additional-cni-plugins-q7p6p"
Apr 17 08:02:39.888284 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.888245 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4h9j\" (UniqueName: \"kubernetes.io/projected/5c94e060-29ca-49bd-9d62-210b4628adef-kube-api-access-g4h9j\") pod \"network-metrics-daemon-r9td5\" (UID: \"5c94e060-29ca-49bd-9d62-210b4628adef\") " pod="openshift-multus/network-metrics-daemon-r9td5"
Apr 17 08:02:39.888424 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.888406 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpf8h\" (UniqueName: \"kubernetes.io/projected/85251bca-2387-47f1-892a-cf015be5673d-kube-api-access-jpf8h\") pod \"aws-ebs-csi-driver-node-5vvsg\" (UID: \"85251bca-2387-47f1-892a-cf015be5673d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg"
Apr 17 08:02:39.975815 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.975727 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-system-cni-dir\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.975815 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.975773 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-multus-conf-dir\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.975815 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.975802 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxpnq\" (UniqueName: \"kubernetes.io/projected/d06f7085-abd8-4770-bc98-8794c5f1a056-kube-api-access-fxpnq\") pod \"node-ca-lf6p6\" (UID: \"d06f7085-abd8-4770-bc98-8794c5f1a056\") " pod="openshift-image-registry/node-ca-lf6p6"
Apr 17 08:02:39.976106 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.975827 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-multus-cni-dir\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.976106 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.975858 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-host-run-k8s-cni-cncf-io\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.976106 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.975883 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-multus-daemon-config\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.976106 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.975924 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d06f7085-abd8-4770-bc98-8794c5f1a056-host\") pod \"node-ca-lf6p6\" (UID: \"d06f7085-abd8-4770-bc98-8794c5f1a056\") " pod="openshift-image-registry/node-ca-lf6p6"
Apr 17 08:02:39.976106 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.975905 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-system-cni-dir\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.976106 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.975972 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-os-release\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.976106 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976000 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-cni-binary-copy\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.976106 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976005 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-host-run-k8s-cni-cncf-io\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.976106 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976023 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-host-run-netns\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.976106 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976049 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-host-var-lib-kubelet\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.976106 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976071 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-multus-cni-dir\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.976607 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976074 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-hostroot\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.976607 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976164 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-host-var-lib-cni-multus\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.976607 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976170 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-os-release\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.976607 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976192 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e95c47f8-d745-42a4-8a4b-3f83ef6805b8-host-slash\") pod \"iptables-alerter-pr74b\" (UID: \"e95c47f8-d745-42a4-8a4b-3f83ef6805b8\") " pod="openshift-network-operator/iptables-alerter-pr74b"
Apr 17 08:02:39.976607 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976238 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-hostroot\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.976607 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976248 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/62832cab-fee8-498a-b9dd-d410e9f3e921-tmp-dir\") pod \"node-resolver-sv4zj\" (UID: \"62832cab-fee8-498a-b9dd-d410e9f3e921\") " pod="openshift-dns/node-resolver-sv4zj"
Apr 17 08:02:39.976607 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976275 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9l49n\" (UniqueName: \"kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n\") pod \"network-check-target-bhnbj\" (UID: \"2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2\") " pod="openshift-network-diagnostics/network-check-target-bhnbj"
Apr 17 08:02:39.976935 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976648 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-cnibin\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.976935 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976707 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-host-var-lib-cni-bin\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.976935 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976742 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/62832cab-fee8-498a-b9dd-d410e9f3e921-hosts-file\") pod \"node-resolver-sv4zj\" (UID: \"62832cab-fee8-498a-b9dd-d410e9f3e921\") " pod="openshift-dns/node-resolver-sv4zj"
Apr 17 08:02:39.976935 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976774 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d06f7085-abd8-4770-bc98-8794c5f1a056-serviceca\") pod \"node-ca-lf6p6\" (UID: \"d06f7085-abd8-4770-bc98-8794c5f1a056\") " pod="openshift-image-registry/node-ca-lf6p6"
Apr 17 08:02:39.976935 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976806 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-etc-kubernetes\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.976935 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976844 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5tkv\" (UniqueName: \"kubernetes.io/projected/62832cab-fee8-498a-b9dd-d410e9f3e921-kube-api-access-m5tkv\") pod \"node-resolver-sv4zj\" (UID: \"62832cab-fee8-498a-b9dd-d410e9f3e921\") " pod="openshift-dns/node-resolver-sv4zj"
Apr 17 08:02:39.976935 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976882 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flzxv\" (UniqueName: \"kubernetes.io/projected/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-kube-api-access-flzxv\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.976935 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976931 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qh5qx\" (UniqueName: \"kubernetes.io/projected/e95c47f8-d745-42a4-8a4b-3f83ef6805b8-kube-api-access-qh5qx\") pod \"iptables-alerter-pr74b\" (UID: \"e95c47f8-d745-42a4-8a4b-3f83ef6805b8\") " pod="openshift-network-operator/iptables-alerter-pr74b"
Apr 17 08:02:39.977285 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976969 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-multus-socket-dir-parent\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.977285 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.977006 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-host-run-multus-certs\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.977285 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.977042 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e95c47f8-d745-42a4-8a4b-3f83ef6805b8-iptables-alerter-script\") pod \"iptables-alerter-pr74b\" (UID: \"e95c47f8-d745-42a4-8a4b-3f83ef6805b8\") " pod="openshift-network-operator/iptables-alerter-pr74b"
Apr 17 08:02:39.977285 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.977158 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-cni-binary-copy\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.977285 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.977184 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-multus-daemon-config\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.977285 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.977257 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d06f7085-abd8-4770-bc98-8794c5f1a056-host\") pod \"node-ca-lf6p6\" (UID: \"d06f7085-abd8-4770-bc98-8794c5f1a056\") " pod="openshift-image-registry/node-ca-lf6p6"
Apr 17 08:02:39.977658 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.977635 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e95c47f8-d745-42a4-8a4b-3f83ef6805b8-host-slash\") pod \"iptables-alerter-pr74b\" (UID: \"e95c47f8-d745-42a4-8a4b-3f83ef6805b8\") " pod="openshift-network-operator/iptables-alerter-pr74b"
Apr 17 08:02:39.977713 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.976115 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-multus-conf-dir\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.977713 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.977685 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e95c47f8-d745-42a4-8a4b-3f83ef6805b8-iptables-alerter-script\") pod \"iptables-alerter-pr74b\" (UID: \"e95c47f8-d745-42a4-8a4b-3f83ef6805b8\") " pod="openshift-network-operator/iptables-alerter-pr74b"
Apr 17 08:02:39.977805 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.977731 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-host-var-lib-cni-multus\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.977805 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.977318 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-host-var-lib-kubelet\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.977805 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.977798 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-host-run-netns\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.977996 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.977870 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/62832cab-fee8-498a-b9dd-d410e9f3e921-hosts-file\") pod \"node-resolver-sv4zj\" (UID: \"62832cab-fee8-498a-b9dd-d410e9f3e921\") " pod="openshift-dns/node-resolver-sv4zj"
Apr 17 08:02:39.977996 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.977939 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-host-var-lib-cni-bin\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.978099 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.978003 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-etc-kubernetes\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.978099 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.978044 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/62832cab-fee8-498a-b9dd-d410e9f3e921-tmp-dir\") pod \"node-resolver-sv4zj\" (UID: \"62832cab-fee8-498a-b9dd-d410e9f3e921\") " pod="openshift-dns/node-resolver-sv4zj"
Apr 17 08:02:39.978099 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.978070 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-multus-socket-dir-parent\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.978234 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.978096 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-host-run-multus-certs\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.978288 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.978242 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-cnibin\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:39.978390 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.978367 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d06f7085-abd8-4770-bc98-8794c5f1a056-serviceca\") pod \"node-ca-lf6p6\" (UID: \"d06f7085-abd8-4770-bc98-8794c5f1a056\") " pod="openshift-image-registry/node-ca-lf6p6"
Apr 17 08:02:39.994010 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.993974 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxpnq\" (UniqueName: \"kubernetes.io/projected/d06f7085-abd8-4770-bc98-8794c5f1a056-kube-api-access-fxpnq\") pod \"node-ca-lf6p6\" (UID: \"d06f7085-abd8-4770-bc98-8794c5f1a056\") " pod="openshift-image-registry/node-ca-lf6p6"
Apr 17 08:02:39.994750 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:39.994718 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 08:02:39.994750 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:39.994745 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 08:02:39.994901 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:39.994762 2580 projected.go:194] Error preparing data for projected volume kube-api-access-9l49n for pod openshift-network-diagnostics/network-check-target-bhnbj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 08:02:39.994901 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:39.994832 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n podName:2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:40.494813151 +0000 UTC m=+3.192769807 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-9l49n" (UniqueName: "kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n") pod "network-check-target-bhnbj" (UID: "2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 08:02:39.997735 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.997327 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5tkv\" (UniqueName: \"kubernetes.io/projected/62832cab-fee8-498a-b9dd-d410e9f3e921-kube-api-access-m5tkv\") pod \"node-resolver-sv4zj\" (UID: \"62832cab-fee8-498a-b9dd-d410e9f3e921\") " pod="openshift-dns/node-resolver-sv4zj"
Apr 17 08:02:39.997735 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.997697 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh5qx\" (UniqueName: \"kubernetes.io/projected/e95c47f8-d745-42a4-8a4b-3f83ef6805b8-kube-api-access-qh5qx\") pod \"iptables-alerter-pr74b\" (UID: \"e95c47f8-d745-42a4-8a4b-3f83ef6805b8\") " pod="openshift-network-operator/iptables-alerter-pr74b"
Apr 17 08:02:39.997735 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:39.997704 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flzxv\" (UniqueName: \"kubernetes.io/projected/abf436c1-3b8e-4c83-b4e5-4cae8c04c259-kube-api-access-flzxv\") pod \"multus-r995s\" (UID: \"abf436c1-3b8e-4c83-b4e5-4cae8c04c259\") " pod="openshift-multus/multus-r995s"
Apr 17 08:02:40.068698 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.068656 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q7p6p"
Apr 17 08:02:40.077319 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.077165 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:02:40.084956 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.084853 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6zncl"]
Apr 17 08:02:40.086072 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.086050 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-clh84"
Apr 17 08:02:40.089399 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.089375 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:02:40.089527 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:40.089460 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zncl" podUID="28077202-06dd-4ed2-862a-f70c6f35f820"
Apr 17 08:02:40.093300 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.093275 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ttxzs"
Apr 17 08:02:40.101200 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.101160 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg"
Apr 17 08:02:40.113018 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.112983 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sv4zj"
Apr 17 08:02:40.118801 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.118771 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-lf6p6"
Apr 17 08:02:40.126644 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.126614 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-r995s"
Apr 17 08:02:40.133412 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.133385 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-pr74b"
Apr 17 08:02:40.178696 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.178658 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/28077202-06dd-4ed2-862a-f70c6f35f820-kubelet-config\") pod \"global-pull-secret-syncer-6zncl\" (UID: \"28077202-06dd-4ed2-862a-f70c6f35f820\") " pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:02:40.178942 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.178705 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret\") pod \"global-pull-secret-syncer-6zncl\" (UID: \"28077202-06dd-4ed2-862a-f70c6f35f820\") " pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:02:40.178942 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.178794 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/28077202-06dd-4ed2-862a-f70c6f35f820-dbus\") pod \"global-pull-secret-syncer-6zncl\" (UID: \"28077202-06dd-4ed2-862a-f70c6f35f820\") " pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:02:40.279464 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.279388 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/28077202-06dd-4ed2-862a-f70c6f35f820-dbus\") pod \"global-pull-secret-syncer-6zncl\" (UID: \"28077202-06dd-4ed2-862a-f70c6f35f820\") " pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:02:40.279464 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.279451 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/28077202-06dd-4ed2-862a-f70c6f35f820-kubelet-config\") pod \"global-pull-secret-syncer-6zncl\" (UID: \"28077202-06dd-4ed2-862a-f70c6f35f820\") " pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:02:40.279699 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.279491 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret\") pod \"global-pull-secret-syncer-6zncl\" (UID: \"28077202-06dd-4ed2-862a-f70c6f35f820\") " pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:02:40.279699 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.279602 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/28077202-06dd-4ed2-862a-f70c6f35f820-dbus\") pod \"global-pull-secret-syncer-6zncl\" (UID: \"28077202-06dd-4ed2-862a-f70c6f35f820\") " pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:02:40.279699 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.279608 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/28077202-06dd-4ed2-862a-f70c6f35f820-kubelet-config\") pod \"global-pull-secret-syncer-6zncl\" (UID: \"28077202-06dd-4ed2-862a-f70c6f35f820\") " pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:02:40.279699 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:40.279616 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 08:02:40.279699 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:40.279700 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret podName:28077202-06dd-4ed2-862a-f70c6f35f820 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:40.779682365 +0000 UTC m=+3.477639008 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret") pod "global-pull-secret-syncer-6zncl" (UID: "28077202-06dd-4ed2-862a-f70c6f35f820") : object "kube-system"/"original-pull-secret" not registered
Apr 17 08:02:40.380108 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.379815 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs\") pod \"network-metrics-daemon-r9td5\" (UID: \"5c94e060-29ca-49bd-9d62-210b4628adef\") " pod="openshift-multus/network-metrics-daemon-r9td5"
Apr 17 08:02:40.380271 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:40.379942 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 08:02:40.380271 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:40.380241 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs podName:5c94e060-29ca-49bd-9d62-210b4628adef nodeName:}" failed. No retries permitted until 2026-04-17 08:02:41.380219553 +0000 UTC m=+4.078176200 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs") pod "network-metrics-daemon-r9td5" (UID: "5c94e060-29ca-49bd-9d62-210b4628adef") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 08:02:40.581933 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.581823 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9l49n\" (UniqueName: \"kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n\") pod \"network-check-target-bhnbj\" (UID: \"2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2\") " pod="openshift-network-diagnostics/network-check-target-bhnbj"
Apr 17 08:02:40.582118 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:40.582016 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 08:02:40.582118 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:40.582040 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 08:02:40.582118 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:40.582053 2580 projected.go:194] Error preparing data for projected volume kube-api-access-9l49n for pod openshift-network-diagnostics/network-check-target-bhnbj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 08:02:40.582118 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:40.582117 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n podName:2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:41.582098792 +0000 UTC m=+4.280055439 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-9l49n" (UniqueName: "kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n") pod "network-check-target-bhnbj" (UID: "2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 08:02:40.741234 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:40.741200 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18d4abe2_95b8_4158_acde_3d01b4526f60.slice/crio-71b0aca4289757609cf50e4f2c38d58254f16c66cc9746d095fd8ca5eaea50d5 WatchSource:0}: Error finding container 71b0aca4289757609cf50e4f2c38d58254f16c66cc9746d095fd8ca5eaea50d5: Status 404 returned error can't find the container with id 71b0aca4289757609cf50e4f2c38d58254f16c66cc9746d095fd8ca5eaea50d5
Apr 17 08:02:40.742635 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:40.742524 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62832cab_fee8_498a_b9dd_d410e9f3e921.slice/crio-98204c164a70950d1277a7dfd0f185b86d54f64ab3b1013470be3e07d37f0ac7 WatchSource:0}: Error finding container 98204c164a70950d1277a7dfd0f185b86d54f64ab3b1013470be3e07d37f0ac7: Status 404 returned error can't find the container with id 98204c164a70950d1277a7dfd0f185b86d54f64ab3b1013470be3e07d37f0ac7
Apr 17 08:02:40.745808 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:40.745778 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85251bca_2387_47f1_892a_cf015be5673d.slice/crio-fa079af7691f37acaa07a2096a8cffb8b9c41452a02c5b42233c54c3c41e90ca WatchSource:0}: Error finding container fa079af7691f37acaa07a2096a8cffb8b9c41452a02c5b42233c54c3c41e90ca: Status 404 returned error can't find the container with id fa079af7691f37acaa07a2096a8cffb8b9c41452a02c5b42233c54c3c41e90ca
Apr 17 08:02:40.746677 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:40.746630 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73847dfb_9da7_48a8_9c86_58744827d1a8.slice/crio-44e37727cbe6d96d57c919a62b7f238f220ce973cd2f37c5cbc0d3ada6fa230f WatchSource:0}: Error finding container 44e37727cbe6d96d57c919a62b7f238f220ce973cd2f37c5cbc0d3ada6fa230f: Status 404 returned error can't find the container with id 44e37727cbe6d96d57c919a62b7f238f220ce973cd2f37c5cbc0d3ada6fa230f
Apr 17 08:02:40.747406 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:40.747388 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd06f7085_abd8_4770_bc98_8794c5f1a056.slice/crio-f1a169a1eef4ea1e6b100b05df7c49a32b12fb0eab3b7ea654b05586c5af9a01 WatchSource:0}: Error finding container f1a169a1eef4ea1e6b100b05df7c49a32b12fb0eab3b7ea654b05586c5af9a01: Status 404 returned error can't find the container with id f1a169a1eef4ea1e6b100b05df7c49a32b12fb0eab3b7ea654b05586c5af9a01
Apr 17 08:02:40.749350 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:40.749313 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode95c47f8_d745_42a4_8a4b_3f83ef6805b8.slice/crio-07fc22ff69349be9107c620210876384421159492500dddef6ac258ac2928119 WatchSource:0}: Error finding container 07fc22ff69349be9107c620210876384421159492500dddef6ac258ac2928119: Status 404 returned error can't find the container with id 07fc22ff69349be9107c620210876384421159492500dddef6ac258ac2928119
Apr 17 08:02:40.750413 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:40.750380 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02718710_e78f_45e5_97ee_f802acc6c063.slice/crio-6ebf15b6913cd37ae10f28f9106d0186116510be3e869a166870a1d16d3c4818 WatchSource:0}: Error finding container 6ebf15b6913cd37ae10f28f9106d0186116510be3e869a166870a1d16d3c4818: Status 404 returned error can't find the container with id 6ebf15b6913cd37ae10f28f9106d0186116510be3e869a166870a1d16d3c4818
Apr 17 08:02:40.751545 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:40.751514 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36159bce_5e37_46a6_b216_a4bc0a7e38a8.slice/crio-e8ff63fa0e1cef5901071c75c86f452b246216d8c00b0fbcfebd5aeeae953c6f WatchSource:0}: Error finding container e8ff63fa0e1cef5901071c75c86f452b246216d8c00b0fbcfebd5aeeae953c6f: Status 404 returned error can't find the container with id e8ff63fa0e1cef5901071c75c86f452b246216d8c00b0fbcfebd5aeeae953c6f
Apr 17 08:02:40.753029 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:02:40.752467 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabf436c1_3b8e_4c83_b4e5_4cae8c04c259.slice/crio-e06fe2f1396d65742261d10a71dec38911484b57f43d9594ede4e02d5c3e80d1 WatchSource:0}: Error finding container e06fe2f1396d65742261d10a71dec38911484b57f43d9594ede4e02d5c3e80d1: Status 404 returned error can't find the container with id e06fe2f1396d65742261d10a71dec38911484b57f43d9594ede4e02d5c3e80d1
Apr 17 08:02:40.783205 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.783176 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret\") pod \"global-pull-secret-syncer-6zncl\" (UID: \"28077202-06dd-4ed2-862a-f70c6f35f820\") " pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:02:40.783371 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:40.783290 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 08:02:40.783371 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:40.783345 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret podName:28077202-06dd-4ed2-862a-f70c6f35f820 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:41.783329004 +0000 UTC m=+4.481285647 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret") pod "global-pull-secret-syncer-6zncl" (UID: "28077202-06dd-4ed2-862a-f70c6f35f820") : object "kube-system"/"original-pull-secret" not registered
Apr 17 08:02:40.794623 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.794582 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:57:38 +0000 UTC" deadline="2027-12-20 17:28:07.078225308 +0000 UTC"
Apr 17 08:02:40.794623 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.794618 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14697h25m26.283610197s"
Apr 17 08:02:40.874215 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.874171 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r995s" event={"ID":"abf436c1-3b8e-4c83-b4e5-4cae8c04c259","Type":"ContainerStarted","Data":"e06fe2f1396d65742261d10a71dec38911484b57f43d9594ede4e02d5c3e80d1"}
Apr 17 08:02:40.875279 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.875246 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
event={"ID":"02718710-e78f-45e5-97ee-f802acc6c063","Type":"ContainerStarted","Data":"6ebf15b6913cd37ae10f28f9106d0186116510be3e869a166870a1d16d3c4818"} Apr 17 08:02:40.876263 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.876231 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-clh84" event={"ID":"73847dfb-9da7-48a8-9c86-58744827d1a8","Type":"ContainerStarted","Data":"44e37727cbe6d96d57c919a62b7f238f220ce973cd2f37c5cbc0d3ada6fa230f"} Apr 17 08:02:40.877265 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.877230 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sv4zj" event={"ID":"62832cab-fee8-498a-b9dd-d410e9f3e921","Type":"ContainerStarted","Data":"98204c164a70950d1277a7dfd0f185b86d54f64ab3b1013470be3e07d37f0ac7"} Apr 17 08:02:40.878657 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.878632 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-245.ec2.internal" event={"ID":"d0c56bbacc967c418e4c91e68e0ba0d3","Type":"ContainerStarted","Data":"5608708a167426b5236bca8b36b2e9810938557243426be4c35976f6aec8c7b7"} Apr 17 08:02:40.879691 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.879668 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ttxzs" event={"ID":"36159bce-5e37-46a6-b216-a4bc0a7e38a8","Type":"ContainerStarted","Data":"e8ff63fa0e1cef5901071c75c86f452b246216d8c00b0fbcfebd5aeeae953c6f"} Apr 17 08:02:40.880671 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.880650 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-pr74b" event={"ID":"e95c47f8-d745-42a4-8a4b-3f83ef6805b8","Type":"ContainerStarted","Data":"07fc22ff69349be9107c620210876384421159492500dddef6ac258ac2928119"} Apr 17 08:02:40.883745 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.883716 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/node-ca-lf6p6" event={"ID":"d06f7085-abd8-4770-bc98-8794c5f1a056","Type":"ContainerStarted","Data":"f1a169a1eef4ea1e6b100b05df7c49a32b12fb0eab3b7ea654b05586c5af9a01"} Apr 17 08:02:40.884693 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.884665 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg" event={"ID":"85251bca-2387-47f1-892a-cf015be5673d","Type":"ContainerStarted","Data":"fa079af7691f37acaa07a2096a8cffb8b9c41452a02c5b42233c54c3c41e90ca"} Apr 17 08:02:40.885571 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.885542 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7p6p" event={"ID":"18d4abe2-95b8-4158-acde-3d01b4526f60","Type":"ContainerStarted","Data":"71b0aca4289757609cf50e4f2c38d58254f16c66cc9746d095fd8ca5eaea50d5"} Apr 17 08:02:40.892343 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:40.892278 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-245.ec2.internal" podStartSLOduration=2.892262102 podStartE2EDuration="2.892262102s" podCreationTimestamp="2026-04-17 08:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:02:40.891575366 +0000 UTC m=+3.589532032" watchObservedRunningTime="2026-04-17 08:02:40.892262102 +0000 UTC m=+3.590218768" Apr 17 08:02:41.387229 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:41.386631 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs\") pod \"network-metrics-daemon-r9td5\" (UID: \"5c94e060-29ca-49bd-9d62-210b4628adef\") " pod="openshift-multus/network-metrics-daemon-r9td5" Apr 17 08:02:41.387229 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:41.386788 
2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:41.387229 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:41.386857 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs podName:5c94e060-29ca-49bd-9d62-210b4628adef nodeName:}" failed. No retries permitted until 2026-04-17 08:02:43.386836628 +0000 UTC m=+6.084793289 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs") pod "network-metrics-daemon-r9td5" (UID: "5c94e060-29ca-49bd-9d62-210b4628adef") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:41.588335 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:41.588241 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9l49n\" (UniqueName: \"kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n\") pod \"network-check-target-bhnbj\" (UID: \"2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2\") " pod="openshift-network-diagnostics/network-check-target-bhnbj" Apr 17 08:02:41.588494 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:41.588424 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 08:02:41.588494 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:41.588445 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 08:02:41.588494 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:41.588457 2580 projected.go:194] Error preparing data for projected volume kube-api-access-9l49n for pod 
openshift-network-diagnostics/network-check-target-bhnbj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:02:41.588658 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:41.588523 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n podName:2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:43.58850288 +0000 UTC m=+6.286459527 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-9l49n" (UniqueName: "kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n") pod "network-check-target-bhnbj" (UID: "2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:02:41.793344 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:41.792718 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret\") pod \"global-pull-secret-syncer-6zncl\" (UID: \"28077202-06dd-4ed2-862a-f70c6f35f820\") " pod="kube-system/global-pull-secret-syncer-6zncl" Apr 17 08:02:41.793344 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:41.792903 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 08:02:41.793344 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:41.792987 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret podName:28077202-06dd-4ed2-862a-f70c6f35f820 nodeName:}" failed. 
No retries permitted until 2026-04-17 08:02:43.792968208 +0000 UTC m=+6.490924858 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret") pod "global-pull-secret-syncer-6zncl" (UID: "28077202-06dd-4ed2-862a-f70c6f35f820") : object "kube-system"/"original-pull-secret" not registered Apr 17 08:02:41.868087 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:41.868003 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhnbj" Apr 17 08:02:41.868253 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:41.868144 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhnbj" podUID="2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2" Apr 17 08:02:41.868632 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:41.868606 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5" Apr 17 08:02:41.868738 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:41.868718 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9td5" podUID="5c94e060-29ca-49bd-9d62-210b4628adef" Apr 17 08:02:41.868800 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:41.868784 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zncl" Apr 17 08:02:41.868876 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:41.868856 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zncl" podUID="28077202-06dd-4ed2-862a-f70c6f35f820" Apr 17 08:02:41.904316 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:41.903956 2580 generic.go:358] "Generic (PLEG): container finished" podID="bd86dab92626f6c7dc59a5d2e42a1f67" containerID="51179bc7e451155ffcb8cdf76797e8a43767661de3752bbf3b6767f73f76fce5" exitCode=0 Apr 17 08:02:41.904316 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:41.904090 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal" event={"ID":"bd86dab92626f6c7dc59a5d2e42a1f67","Type":"ContainerDied","Data":"51179bc7e451155ffcb8cdf76797e8a43767661de3752bbf3b6767f73f76fce5"} Apr 17 08:02:42.923123 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:42.922338 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal" event={"ID":"bd86dab92626f6c7dc59a5d2e42a1f67","Type":"ContainerStarted","Data":"a7d5c24e39510583c76236249ac7ed881d1811aedf65ce170979380345840c75"} Apr 17 08:02:42.936238 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:42.935647 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-245.ec2.internal" podStartSLOduration=4.935624386 podStartE2EDuration="4.935624386s" podCreationTimestamp="2026-04-17 08:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:02:42.935264024 +0000 UTC m=+5.633220718" watchObservedRunningTime="2026-04-17 08:02:42.935624386 +0000 UTC m=+5.633581053" Apr 17 08:02:43.406126 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:43.406034 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs\") pod \"network-metrics-daemon-r9td5\" (UID: \"5c94e060-29ca-49bd-9d62-210b4628adef\") " pod="openshift-multus/network-metrics-daemon-r9td5" Apr 17 08:02:43.406303 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:43.406201 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:43.406303 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:43.406272 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs podName:5c94e060-29ca-49bd-9d62-210b4628adef nodeName:}" failed. No retries permitted until 2026-04-17 08:02:47.406251811 +0000 UTC m=+10.104208458 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs") pod "network-metrics-daemon-r9td5" (UID: "5c94e060-29ca-49bd-9d62-210b4628adef") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:43.608494 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:43.608453 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9l49n\" (UniqueName: \"kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n\") pod \"network-check-target-bhnbj\" (UID: \"2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2\") " pod="openshift-network-diagnostics/network-check-target-bhnbj" Apr 17 08:02:43.608682 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:43.608620 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 08:02:43.608682 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:43.608639 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 08:02:43.608682 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:43.608652 2580 projected.go:194] Error preparing data for projected volume kube-api-access-9l49n for pod openshift-network-diagnostics/network-check-target-bhnbj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:02:43.608845 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:43.608719 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n podName:2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2 nodeName:}" failed. 
No retries permitted until 2026-04-17 08:02:47.608698972 +0000 UTC m=+10.306655638 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-9l49n" (UniqueName: "kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n") pod "network-check-target-bhnbj" (UID: "2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:02:43.811152 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:43.810476 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret\") pod \"global-pull-secret-syncer-6zncl\" (UID: \"28077202-06dd-4ed2-862a-f70c6f35f820\") " pod="kube-system/global-pull-secret-syncer-6zncl" Apr 17 08:02:43.811152 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:43.810624 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 08:02:43.811152 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:43.810697 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret podName:28077202-06dd-4ed2-862a-f70c6f35f820 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:47.810675473 +0000 UTC m=+10.508632134 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret") pod "global-pull-secret-syncer-6zncl" (UID: "28077202-06dd-4ed2-862a-f70c6f35f820") : object "kube-system"/"original-pull-secret" not registered Apr 17 08:02:43.866270 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:43.866088 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhnbj" Apr 17 08:02:43.866270 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:43.866225 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhnbj" podUID="2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2" Apr 17 08:02:43.866693 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:43.866675 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5" Apr 17 08:02:43.866810 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:43.866788 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9td5" podUID="5c94e060-29ca-49bd-9d62-210b4628adef" Apr 17 08:02:43.866979 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:43.866964 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zncl" Apr 17 08:02:43.867093 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:43.867048 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6zncl" podUID="28077202-06dd-4ed2-862a-f70c6f35f820" Apr 17 08:02:45.866153 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:45.865618 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5" Apr 17 08:02:45.866153 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:45.865763 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9td5" podUID="5c94e060-29ca-49bd-9d62-210b4628adef" Apr 17 08:02:45.866153 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:45.865875 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zncl" Apr 17 08:02:45.866153 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:45.866005 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zncl" podUID="28077202-06dd-4ed2-862a-f70c6f35f820" Apr 17 08:02:45.866153 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:45.866041 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhnbj" Apr 17 08:02:45.866153 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:45.866115 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhnbj" podUID="2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2" Apr 17 08:02:47.440813 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:47.440769 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs\") pod \"network-metrics-daemon-r9td5\" (UID: \"5c94e060-29ca-49bd-9d62-210b4628adef\") " pod="openshift-multus/network-metrics-daemon-r9td5" Apr 17 08:02:47.441308 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:47.440983 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:47.441308 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:47.441068 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs podName:5c94e060-29ca-49bd-9d62-210b4628adef nodeName:}" failed. No retries permitted until 2026-04-17 08:02:55.441046281 +0000 UTC m=+18.139002967 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs") pod "network-metrics-daemon-r9td5" (UID: "5c94e060-29ca-49bd-9d62-210b4628adef") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:47.642779 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:47.642734 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9l49n\" (UniqueName: \"kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n\") pod \"network-check-target-bhnbj\" (UID: \"2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2\") " pod="openshift-network-diagnostics/network-check-target-bhnbj" Apr 17 08:02:47.643001 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:47.642945 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 08:02:47.643001 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:47.642971 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 08:02:47.643001 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:47.642986 2580 projected.go:194] Error preparing data for projected volume kube-api-access-9l49n for pod openshift-network-diagnostics/network-check-target-bhnbj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:02:47.643162 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:47.643052 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n podName:2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2 nodeName:}" failed. 
No retries permitted until 2026-04-17 08:02:55.643029989 +0000 UTC m=+18.340986635 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-9l49n" (UniqueName: "kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n") pod "network-check-target-bhnbj" (UID: "2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:02:47.844832 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:47.844796 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret\") pod \"global-pull-secret-syncer-6zncl\" (UID: \"28077202-06dd-4ed2-862a-f70c6f35f820\") " pod="kube-system/global-pull-secret-syncer-6zncl" Apr 17 08:02:47.844989 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:47.844934 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 08:02:47.845062 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:47.844997 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret podName:28077202-06dd-4ed2-862a-f70c6f35f820 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:55.844983729 +0000 UTC m=+18.542940375 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret") pod "global-pull-secret-syncer-6zncl" (UID: "28077202-06dd-4ed2-862a-f70c6f35f820") : object "kube-system"/"original-pull-secret" not registered Apr 17 08:02:47.866749 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:47.866713 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhnbj"
Apr 17 08:02:47.866932 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:47.866848 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhnbj" podUID="2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2"
Apr 17 08:02:47.866986 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:47.866952 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:02:47.867118 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:47.867093 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zncl" podUID="28077202-06dd-4ed2-862a-f70c6f35f820"
Apr 17 08:02:47.867488 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:47.867455 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5"
Apr 17 08:02:47.867628 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:47.867608 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9td5" podUID="5c94e060-29ca-49bd-9d62-210b4628adef"
Apr 17 08:02:49.867095 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:49.866211 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhnbj"
Apr 17 08:02:49.867095 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:49.866597 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhnbj" podUID="2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2"
Apr 17 08:02:49.867095 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:49.866603 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:02:49.867095 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:49.866758 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zncl" podUID="28077202-06dd-4ed2-862a-f70c6f35f820"
Apr 17 08:02:49.867095 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:49.866883 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5"
Apr 17 08:02:49.867095 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:49.867036 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9td5" podUID="5c94e060-29ca-49bd-9d62-210b4628adef"
Apr 17 08:02:49.940542 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:49.940391 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-clh84" event={"ID":"73847dfb-9da7-48a8-9c86-58744827d1a8","Type":"ContainerStarted","Data":"0561f36a9d49160b0af3d98e7d9d56e8f5202cf66f9bbcf0dac7f79c2e490f2a"}
Apr 17 08:02:49.943832 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:49.943776 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sv4zj" event={"ID":"62832cab-fee8-498a-b9dd-d410e9f3e921","Type":"ContainerStarted","Data":"848bea67b013845a3c8815f16d61d1a8a8f488704d2e2d7f64e458a649a09e79"}
Apr 17 08:02:49.946115 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:49.946067 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ttxzs" event={"ID":"36159bce-5e37-46a6-b216-a4bc0a7e38a8","Type":"ContainerStarted","Data":"5973ab29638aab92940153ec353772c5f623444e3464286a7d98bdae4d8ba481"}
Apr 17 08:02:49.947741 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:49.947688 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lf6p6" event={"ID":"d06f7085-abd8-4770-bc98-8794c5f1a056","Type":"ContainerStarted","Data":"9afae2e03c20f101853f9576037d6035fa28f757a09c9d08721ab34feae59cb7"}
Apr 17 08:02:49.950670 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:49.950629 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg" event={"ID":"85251bca-2387-47f1-892a-cf015be5673d","Type":"ContainerStarted","Data":"29594dd0ff7c3410ca1e3509e24d51023cad16b65878c38d8e51ae904904276e"}
Apr 17 08:02:49.952417 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:49.952383 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7p6p" event={"ID":"18d4abe2-95b8-4158-acde-3d01b4526f60","Type":"ContainerStarted","Data":"445e2fa547806500937efc882f2ad984dc551da2463c61130746a2fff65eb5b1"}
Apr 17 08:02:49.954906 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:49.954865 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-clh84" podStartSLOduration=3.106459017 podStartE2EDuration="11.954839251s" podCreationTimestamp="2026-04-17 08:02:38 +0000 UTC" firstStartedPulling="2026-04-17 08:02:40.748404585 +0000 UTC m=+3.446361244" lastFinishedPulling="2026-04-17 08:02:49.596784828 +0000 UTC m=+12.294741478" observedRunningTime="2026-04-17 08:02:49.953233331 +0000 UTC m=+12.651189997" watchObservedRunningTime="2026-04-17 08:02:49.954839251 +0000 UTC m=+12.652795916"
Apr 17 08:02:49.977321 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:49.977267 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-ttxzs" podStartSLOduration=3.150064161 podStartE2EDuration="11.977249238s" podCreationTimestamp="2026-04-17 08:02:38 +0000 UTC" firstStartedPulling="2026-04-17 08:02:40.754936737 +0000 UTC m=+3.452893385" lastFinishedPulling="2026-04-17 08:02:49.582121818 +0000 UTC m=+12.280078462" observedRunningTime="2026-04-17 08:02:49.964818459 +0000 UTC m=+12.662775124" watchObservedRunningTime="2026-04-17 08:02:49.977249238 +0000 UTC m=+12.675205905"
Apr 17 08:02:49.992564 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:49.992500 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-sv4zj" podStartSLOduration=3.148113366 podStartE2EDuration="11.992480105s" podCreationTimestamp="2026-04-17 08:02:38 +0000 UTC" firstStartedPulling="2026-04-17 08:02:40.744630229 +0000 UTC m=+3.442586874" lastFinishedPulling="2026-04-17 08:02:49.588996955 +0000 UTC m=+12.286953613" observedRunningTime="2026-04-17 08:02:49.976638561 +0000 UTC m=+12.674595227" watchObservedRunningTime="2026-04-17 08:02:49.992480105 +0000 UTC m=+12.690436773"
Apr 17 08:02:49.992751 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:49.992599 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-lf6p6" podStartSLOduration=3.170654768 podStartE2EDuration="11.992593842s" podCreationTimestamp="2026-04-17 08:02:38 +0000 UTC" firstStartedPulling="2026-04-17 08:02:40.749317211 +0000 UTC m=+3.447273859" lastFinishedPulling="2026-04-17 08:02:49.571256269 +0000 UTC m=+12.269212933" observedRunningTime="2026-04-17 08:02:49.99228566 +0000 UTC m=+12.690242327" watchObservedRunningTime="2026-04-17 08:02:49.992593842 +0000 UTC m=+12.690550511"
Apr 17 08:02:50.298896 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:50.298860 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-ttxzs"
Apr 17 08:02:50.955258 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:50.955173 2580 generic.go:358] "Generic (PLEG): container finished" podID="18d4abe2-95b8-4158-acde-3d01b4526f60" containerID="445e2fa547806500937efc882f2ad984dc551da2463c61130746a2fff65eb5b1" exitCode=0
Apr 17 08:02:50.955951 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:50.955298 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7p6p" event={"ID":"18d4abe2-95b8-4158-acde-3d01b4526f60","Type":"ContainerDied","Data":"445e2fa547806500937efc882f2ad984dc551da2463c61130746a2fff65eb5b1"}
Apr 17 08:02:51.866577 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:51.866225 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhnbj"
Apr 17 08:02:51.866577 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:51.866260 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5"
Apr 17 08:02:51.866577 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:51.866226 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:02:51.866577 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:51.866361 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhnbj" podUID="2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2"
Apr 17 08:02:51.866577 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:51.866438 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9td5" podUID="5c94e060-29ca-49bd-9d62-210b4628adef"
Apr 17 08:02:51.866577 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:51.866523 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zncl" podUID="28077202-06dd-4ed2-862a-f70c6f35f820"
Apr 17 08:02:51.958724 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:51.958680 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-pr74b" event={"ID":"e95c47f8-d745-42a4-8a4b-3f83ef6805b8","Type":"ContainerStarted","Data":"cff8a7ad8a5ba13dd294e563323bae242ff9bbe51ccf3d821ace452d7f1f79f2"}
Apr 17 08:02:51.969742 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:51.969687 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-pr74b" podStartSLOduration=5.138028817 podStartE2EDuration="13.969669636s" podCreationTimestamp="2026-04-17 08:02:38 +0000 UTC" firstStartedPulling="2026-04-17 08:02:40.75167032 +0000 UTC m=+3.449626970" lastFinishedPulling="2026-04-17 08:02:49.583311129 +0000 UTC m=+12.281267789" observedRunningTime="2026-04-17 08:02:51.969539896 +0000 UTC m=+14.667496564" watchObservedRunningTime="2026-04-17 08:02:51.969669636 +0000 UTC m=+14.667626302"
Apr 17 08:02:52.519517 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:52.519476 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-ttxzs"
Apr 17 08:02:52.520234 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:52.520208 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-ttxzs"
Apr 17 08:02:52.961805 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:52.961775 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-ttxzs"
Apr 17 08:02:53.866425 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:53.866377 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhnbj"
Apr 17 08:02:53.866619 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:53.866514 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhnbj" podUID="2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2"
Apr 17 08:02:53.866619 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:53.866559 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:02:53.866746 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:53.866690 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zncl" podUID="28077202-06dd-4ed2-862a-f70c6f35f820"
Apr 17 08:02:53.866802 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:53.866765 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5"
Apr 17 08:02:53.866865 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:53.866845 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9td5" podUID="5c94e060-29ca-49bd-9d62-210b4628adef"
Apr 17 08:02:55.511177 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:55.510926 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs\") pod \"network-metrics-daemon-r9td5\" (UID: \"5c94e060-29ca-49bd-9d62-210b4628adef\") " pod="openshift-multus/network-metrics-daemon-r9td5"
Apr 17 08:02:55.511790 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:55.511071 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 08:02:55.511790 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:55.511298 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs podName:5c94e060-29ca-49bd-9d62-210b4628adef nodeName:}" failed. No retries permitted until 2026-04-17 08:03:11.51127954 +0000 UTC m=+34.209236211 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs") pod "network-metrics-daemon-r9td5" (UID: "5c94e060-29ca-49bd-9d62-210b4628adef") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 08:02:55.712644 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:55.712603 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9l49n\" (UniqueName: \"kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n\") pod \"network-check-target-bhnbj\" (UID: \"2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2\") " pod="openshift-network-diagnostics/network-check-target-bhnbj"
Apr 17 08:02:55.712816 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:55.712769 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 08:02:55.712816 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:55.712787 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 08:02:55.712816 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:55.712798 2580 projected.go:194] Error preparing data for projected volume kube-api-access-9l49n for pod openshift-network-diagnostics/network-check-target-bhnbj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 08:02:55.713014 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:55.712856 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n podName:2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:11.712839062 +0000 UTC m=+34.410795722 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-9l49n" (UniqueName: "kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n") pod "network-check-target-bhnbj" (UID: "2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 08:02:55.866561 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:55.866443 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhnbj"
Apr 17 08:02:55.866561 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:55.866449 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5"
Apr 17 08:02:55.866759 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:55.866577 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhnbj" podUID="2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2"
Apr 17 08:02:55.866759 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:55.866686 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9td5" podUID="5c94e060-29ca-49bd-9d62-210b4628adef"
Apr 17 08:02:55.866759 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:55.866449 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:02:55.866874 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:55.866783 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zncl" podUID="28077202-06dd-4ed2-862a-f70c6f35f820"
Apr 17 08:02:55.914722 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:55.914681 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret\") pod \"global-pull-secret-syncer-6zncl\" (UID: \"28077202-06dd-4ed2-862a-f70c6f35f820\") " pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:02:55.914889 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:55.914807 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 08:02:55.914996 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:55.914895 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret podName:28077202-06dd-4ed2-862a-f70c6f35f820 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:11.914873121 +0000 UTC m=+34.612829778 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret") pod "global-pull-secret-syncer-6zncl" (UID: "28077202-06dd-4ed2-862a-f70c6f35f820") : object "kube-system"/"original-pull-secret" not registered
Apr 17 08:02:57.867482 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:57.867440 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhnbj"
Apr 17 08:02:57.867954 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:57.867498 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5"
Apr 17 08:02:57.867954 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:57.867557 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhnbj" podUID="2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2"
Apr 17 08:02:57.867954 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:57.867613 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:02:57.867954 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:57.867662 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9td5" podUID="5c94e060-29ca-49bd-9d62-210b4628adef"
Apr 17 08:02:57.867954 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:57.867682 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zncl" podUID="28077202-06dd-4ed2-862a-f70c6f35f820"
Apr 17 08:02:59.865975 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:59.865937 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhnbj"
Apr 17 08:02:59.865975 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:59.865966 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5"
Apr 17 08:02:59.866551 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:02:59.865937 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:02:59.866551 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:59.866077 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhnbj" podUID="2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2"
Apr 17 08:02:59.866551 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:59.866206 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zncl" podUID="28077202-06dd-4ed2-862a-f70c6f35f820"
Apr 17 08:02:59.866551 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:02:59.866284 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9td5" podUID="5c94e060-29ca-49bd-9d62-210b4628adef"
Apr 17 08:03:01.202201 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:01.201850 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 08:03:01.839013 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:01.838635 2580 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T08:03:01.20187377Z","UUID":"c519fec6-65c1-4116-8885-c1b2ea02ec94","Handler":null,"Name":"","Endpoint":""}
Apr 17 08:03:01.840404 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:01.840382 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 08:03:01.840548 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:01.840414 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 08:03:01.865529 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:01.865431 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhnbj"
Apr 17 08:03:01.865681 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:01.865567 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5"
Apr 17 08:03:01.865681 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:01.865431 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:03:01.865681 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:01.865564 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhnbj" podUID="2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2"
Apr 17 08:03:01.865847 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:01.865685 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9td5" podUID="5c94e060-29ca-49bd-9d62-210b4628adef"
Apr 17 08:03:01.865847 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:01.865794 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zncl" podUID="28077202-06dd-4ed2-862a-f70c6f35f820"
Apr 17 08:03:01.977477 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:01.977442 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg" event={"ID":"85251bca-2387-47f1-892a-cf015be5673d","Type":"ContainerStarted","Data":"f4f0ad5dd566385293439c56b7ab5fca1e8566975f9e72c8fc6127b313a805d2"}
Apr 17 08:03:01.978984 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:01.978956 2580 generic.go:358] "Generic (PLEG): container finished" podID="18d4abe2-95b8-4158-acde-3d01b4526f60" containerID="a87cf69a5a4a989c515ca3707ddceb915e2b0f39fb2b8444c98a2d626cf8a6f3" exitCode=0
Apr 17 08:03:01.979127 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:01.979044 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7p6p" event={"ID":"18d4abe2-95b8-4158-acde-3d01b4526f60","Type":"ContainerDied","Data":"a87cf69a5a4a989c515ca3707ddceb915e2b0f39fb2b8444c98a2d626cf8a6f3"}
Apr 17 08:03:01.980438 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:01.980390 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r995s" event={"ID":"abf436c1-3b8e-4c83-b4e5-4cae8c04c259","Type":"ContainerStarted","Data":"6364caf8364a7274085a28012dc844615c710514b3a5b7d4c2695e0a7433275d"}
Apr 17 08:03:01.983636 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:01.983608 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" event={"ID":"02718710-e78f-45e5-97ee-f802acc6c063","Type":"ContainerStarted","Data":"4abec31d081aac024cd4c6ecef8dd9496816cac8613adfe6b702e722c7be04b8"}
Apr 17 08:03:01.983749 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:01.983644 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" event={"ID":"02718710-e78f-45e5-97ee-f802acc6c063","Type":"ContainerStarted","Data":"0023a7f02a881a4907d107d5e841a5ba2948960fc457e452973b551a02ed0e91"}
Apr 17 08:03:01.983749 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:01.983659 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" event={"ID":"02718710-e78f-45e5-97ee-f802acc6c063","Type":"ContainerStarted","Data":"7608d4fd653482f5200d430807823761770922e40af7cd79811207d064d919ce"}
Apr 17 08:03:01.983749 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:01.983672 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" event={"ID":"02718710-e78f-45e5-97ee-f802acc6c063","Type":"ContainerStarted","Data":"9544dba314a807a4908e8272296b2ae5e44e768f1a76872f9c693099f598b7b6"}
Apr 17 08:03:01.983749 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:01.983683 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" event={"ID":"02718710-e78f-45e5-97ee-f802acc6c063","Type":"ContainerStarted","Data":"a5c186792d2bf2dac0afa940681d0a5b21a4370635f039dad6615474ff4e26fa"}
Apr 17 08:03:01.983749 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:01.983695 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" event={"ID":"02718710-e78f-45e5-97ee-f802acc6c063","Type":"ContainerStarted","Data":"8d65d2f372fbc87a4b946eb01c2c367c89e15019ed7be317f4ff436145214243"}
Apr 17 08:03:02.016546 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:02.016499 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-r995s" podStartSLOduration=3.774167958 podStartE2EDuration="24.01648166s" podCreationTimestamp="2026-04-17 08:02:38 +0000 UTC" firstStartedPulling="2026-04-17 08:02:40.754803053 +0000 UTC m=+3.452759704" lastFinishedPulling="2026-04-17 08:03:00.997116754 +0000 UTC m=+23.695073406" observedRunningTime="2026-04-17 08:03:02.016077204 +0000 UTC m=+24.714033893" watchObservedRunningTime="2026-04-17 08:03:02.01648166 +0000 UTC m=+24.714438326"
Apr 17 08:03:02.987352 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:02.987308 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg" event={"ID":"85251bca-2387-47f1-892a-cf015be5673d","Type":"ContainerStarted","Data":"5b684543e0e73e5eda377184c13b0e4d5f1d5aca48f8f4a845c31aa3ef847355"}
Apr 17 08:03:03.004007 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:03.003948 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vvsg" podStartSLOduration=3.471994394 podStartE2EDuration="25.003931942s" podCreationTimestamp="2026-04-17 08:02:38 +0000 UTC" firstStartedPulling="2026-04-17 08:02:40.747738057 +0000 UTC m=+3.445694707" lastFinishedPulling="2026-04-17 08:03:02.279675608 +0000 UTC m=+24.977632255" observedRunningTime="2026-04-17 08:03:03.003839975 +0000 UTC m=+25.701796641" watchObservedRunningTime="2026-04-17 08:03:03.003931942 +0000 UTC m=+25.701888603"
Apr 17 08:03:03.865705 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:03.865666 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5"
Apr 17 08:03:03.865705 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:03.865688 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:03:03.865970 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:03.865714 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhnbj"
Apr 17 08:03:03.865970 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:03.865814 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9td5" podUID="5c94e060-29ca-49bd-9d62-210b4628adef"
Apr 17 08:03:03.865970 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:03.865937 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zncl" podUID="28077202-06dd-4ed2-862a-f70c6f35f820"
Apr 17 08:03:03.866120 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:03.865990 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhnbj" podUID="2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2"
Apr 17 08:03:03.991295 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:03.991202 2580 generic.go:358] "Generic (PLEG): container finished" podID="18d4abe2-95b8-4158-acde-3d01b4526f60" containerID="7282a0d8a791ee45de8e102aefecdd55aa5ae9e1ae2e9013fa78aa7371c1a382" exitCode=0
Apr 17 08:03:03.991295 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:03.991289 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7p6p" event={"ID":"18d4abe2-95b8-4158-acde-3d01b4526f60","Type":"ContainerDied","Data":"7282a0d8a791ee45de8e102aefecdd55aa5ae9e1ae2e9013fa78aa7371c1a382"}
Apr 17 08:03:03.994178 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:03.994147 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" event={"ID":"02718710-e78f-45e5-97ee-f802acc6c063","Type":"ContainerStarted","Data":"dd85bc310f94ea7b320ece6cf116a415f7fec5b8e7b1f378c59b48d8e3a8ddf1"}
Apr 17 08:03:05.866170 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:05.866009 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhnbj"
Apr 17 08:03:05.866633 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:05.866016 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:03:05.866633 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:05.866257 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhnbj" podUID="2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2"
Apr 17 08:03:05.866633 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:05.866016 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5"
Apr 17 08:03:05.866633 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:05.866320 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zncl" podUID="28077202-06dd-4ed2-862a-f70c6f35f820"
Apr 17 08:03:05.866633 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:05.866388 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-r9td5" podUID="5c94e060-29ca-49bd-9d62-210b4628adef" Apr 17 08:03:06.000225 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:06.000187 2580 generic.go:358] "Generic (PLEG): container finished" podID="18d4abe2-95b8-4158-acde-3d01b4526f60" containerID="17b5096ed90d99c0faea7b9a2b33732739d1615fdaa60c33db532cb3c07e8fc0" exitCode=0 Apr 17 08:03:06.000423 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:06.000269 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7p6p" event={"ID":"18d4abe2-95b8-4158-acde-3d01b4526f60","Type":"ContainerDied","Data":"17b5096ed90d99c0faea7b9a2b33732739d1615fdaa60c33db532cb3c07e8fc0"} Apr 17 08:03:06.003656 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:06.003619 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" event={"ID":"02718710-e78f-45e5-97ee-f802acc6c063","Type":"ContainerStarted","Data":"af48fdca79b3165f95cb4e89b23e9c7037f98a081704faf66658c7e0b20821a0"} Apr 17 08:03:06.003966 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:06.003946 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:03:06.004066 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:06.003973 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:03:06.019108 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:06.019079 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:03:06.019237 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:06.019151 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:03:06.046037 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:06.045984 2580 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" podStartSLOduration=7.799353615 podStartE2EDuration="28.045969704s" podCreationTimestamp="2026-04-17 08:02:38 +0000 UTC" firstStartedPulling="2026-04-17 08:02:40.75263599 +0000 UTC m=+3.450592651" lastFinishedPulling="2026-04-17 08:03:00.999252089 +0000 UTC m=+23.697208740" observedRunningTime="2026-04-17 08:03:06.042500931 +0000 UTC m=+28.740457593" watchObservedRunningTime="2026-04-17 08:03:06.045969704 +0000 UTC m=+28.743926395" Apr 17 08:03:07.007079 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:07.007020 2580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 08:03:07.794443 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:07.794393 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bhnbj"] Apr 17 08:03:07.794634 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:07.794544 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhnbj" Apr 17 08:03:07.794692 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:07.794659 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhnbj" podUID="2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2" Apr 17 08:03:07.795301 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:07.795246 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6zncl"] Apr 17 08:03:07.795421 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:07.795378 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zncl" Apr 17 08:03:07.795552 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:07.795510 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zncl" podUID="28077202-06dd-4ed2-862a-f70c6f35f820" Apr 17 08:03:07.796449 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:07.796424 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-r9td5"] Apr 17 08:03:07.796556 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:07.796525 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5" Apr 17 08:03:07.796656 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:07.796631 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9td5" podUID="5c94e060-29ca-49bd-9d62-210b4628adef" Apr 17 08:03:08.008343 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:08.008306 2580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 08:03:08.136184 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:08.136156 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr" Apr 17 08:03:08.865639 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:08.865604 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5" Apr 17 08:03:08.865787 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:08.865734 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9td5" podUID="5c94e060-29ca-49bd-9d62-210b4628adef" Apr 17 08:03:09.865784 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:09.865751 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhnbj" Apr 17 08:03:09.866258 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:09.865796 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zncl" Apr 17 08:03:09.866258 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:09.865871 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bhnbj" podUID="2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2" Apr 17 08:03:09.866384 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:09.866364 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6zncl" podUID="28077202-06dd-4ed2-862a-f70c6f35f820" Apr 17 08:03:10.866364 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:10.866312 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5" Apr 17 08:03:10.866806 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:10.866473 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9td5" podUID="5c94e060-29ca-49bd-9d62-210b4628adef" Apr 17 08:03:11.530805 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:11.530761 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs\") pod \"network-metrics-daemon-r9td5\" (UID: \"5c94e060-29ca-49bd-9d62-210b4628adef\") " pod="openshift-multus/network-metrics-daemon-r9td5" Apr 17 08:03:11.530990 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:11.530964 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:03:11.531057 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:11.531044 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs podName:5c94e060-29ca-49bd-9d62-210b4628adef nodeName:}" failed. No retries permitted until 2026-04-17 08:03:43.531021505 +0000 UTC m=+66.228978151 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs") pod "network-metrics-daemon-r9td5" (UID: "5c94e060-29ca-49bd-9d62-210b4628adef") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:03:11.732781 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:11.732732 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9l49n\" (UniqueName: \"kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n\") pod \"network-check-target-bhnbj\" (UID: \"2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2\") " pod="openshift-network-diagnostics/network-check-target-bhnbj" Apr 17 08:03:11.732953 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:11.732931 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 08:03:11.732998 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:11.732958 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 08:03:11.732998 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:11.732969 2580 projected.go:194] Error preparing data for projected volume kube-api-access-9l49n for pod openshift-network-diagnostics/network-check-target-bhnbj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:03:11.733093 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:11.733033 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n podName:2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2 nodeName:}" failed. 
No retries permitted until 2026-04-17 08:03:43.733015833 +0000 UTC m=+66.430972479 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-9l49n" (UniqueName: "kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n") pod "network-check-target-bhnbj" (UID: "2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:03:11.866466 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:11.866371 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zncl" Apr 17 08:03:11.866466 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:11.866408 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhnbj" Apr 17 08:03:11.866872 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:11.866517 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zncl" podUID="28077202-06dd-4ed2-862a-f70c6f35f820" Apr 17 08:03:11.866872 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:11.866639 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bhnbj" podUID="2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2" Apr 17 08:03:11.934033 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:11.933981 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret\") pod \"global-pull-secret-syncer-6zncl\" (UID: \"28077202-06dd-4ed2-862a-f70c6f35f820\") " pod="kube-system/global-pull-secret-syncer-6zncl" Apr 17 08:03:11.934206 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:11.934152 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 08:03:11.934297 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:11.934233 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret podName:28077202-06dd-4ed2-862a-f70c6f35f820 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:43.934212399 +0000 UTC m=+66.632169044 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret") pod "global-pull-secret-syncer-6zncl" (UID: "28077202-06dd-4ed2-862a-f70c6f35f820") : object "kube-system"/"original-pull-secret" not registered Apr 17 08:03:12.866154 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:12.865960 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5" Apr 17 08:03:12.866308 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:12.866279 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r9td5" podUID="5c94e060-29ca-49bd-9d62-210b4628adef" Apr 17 08:03:13.620774 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.620750 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-245.ec2.internal" event="NodeReady" Apr 17 08:03:13.621341 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.620880 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 08:03:13.669375 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.669340 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zbswg"] Apr 17 08:03:13.672368 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.672348 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gj4s7"] Apr 17 08:03:13.672537 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.672517 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-zbswg" Apr 17 08:03:13.674553 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.674526 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wdhxs\"" Apr 17 08:03:13.674644 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.674572 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 08:03:13.674853 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.674836 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 08:03:13.675101 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.675084 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gj4s7" Apr 17 08:03:13.676631 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.676609 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 08:03:13.676765 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.676748 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-x9bwm\"" Apr 17 08:03:13.676765 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.676757 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 08:03:13.676859 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.676764 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 08:03:13.678339 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.678321 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gj4s7"] Apr 17 08:03:13.682081 ip-10-0-128-245 kubenswrapper[2580]: 
I0417 08:03:13.682064 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zbswg"] Apr 17 08:03:13.848387 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.848355 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e555087a-130b-4ab0-aaa8-92c983ed7e0b-config-volume\") pod \"dns-default-zbswg\" (UID: \"e555087a-130b-4ab0-aaa8-92c983ed7e0b\") " pod="openshift-dns/dns-default-zbswg" Apr 17 08:03:13.848564 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.848407 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls\") pod \"dns-default-zbswg\" (UID: \"e555087a-130b-4ab0-aaa8-92c983ed7e0b\") " pod="openshift-dns/dns-default-zbswg" Apr 17 08:03:13.848564 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.848426 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t75z4\" (UniqueName: \"kubernetes.io/projected/e1d27989-b735-4d7f-b801-7b81443d7ba7-kube-api-access-t75z4\") pod \"ingress-canary-gj4s7\" (UID: \"e1d27989-b735-4d7f-b801-7b81443d7ba7\") " pod="openshift-ingress-canary/ingress-canary-gj4s7" Apr 17 08:03:13.848564 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.848456 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdc7c\" (UniqueName: \"kubernetes.io/projected/e555087a-130b-4ab0-aaa8-92c983ed7e0b-kube-api-access-gdc7c\") pod \"dns-default-zbswg\" (UID: \"e555087a-130b-4ab0-aaa8-92c983ed7e0b\") " pod="openshift-dns/dns-default-zbswg" Apr 17 08:03:13.848564 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.848479 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert\") pod \"ingress-canary-gj4s7\" (UID: \"e1d27989-b735-4d7f-b801-7b81443d7ba7\") " pod="openshift-ingress-canary/ingress-canary-gj4s7" Apr 17 08:03:13.848564 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.848496 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e555087a-130b-4ab0-aaa8-92c983ed7e0b-tmp-dir\") pod \"dns-default-zbswg\" (UID: \"e555087a-130b-4ab0-aaa8-92c983ed7e0b\") " pod="openshift-dns/dns-default-zbswg" Apr 17 08:03:13.866467 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.866444 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zncl" Apr 17 08:03:13.866616 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.866597 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhnbj" Apr 17 08:03:13.871035 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.871018 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 08:03:13.871235 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.871224 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 08:03:13.871300 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.871282 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nfrwk\"" Apr 17 08:03:13.871863 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.871846 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 08:03:13.949341 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.949257 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls\") pod \"dns-default-zbswg\" (UID: \"e555087a-130b-4ab0-aaa8-92c983ed7e0b\") " pod="openshift-dns/dns-default-zbswg" Apr 17 08:03:13.949341 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.949299 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t75z4\" (UniqueName: \"kubernetes.io/projected/e1d27989-b735-4d7f-b801-7b81443d7ba7-kube-api-access-t75z4\") pod \"ingress-canary-gj4s7\" (UID: \"e1d27989-b735-4d7f-b801-7b81443d7ba7\") " pod="openshift-ingress-canary/ingress-canary-gj4s7" Apr 17 08:03:13.949521 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:13.949404 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 08:03:13.949521 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:13.949471 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls podName:e555087a-130b-4ab0-aaa8-92c983ed7e0b nodeName:}" failed. No retries permitted until 2026-04-17 08:03:14.449454764 +0000 UTC m=+37.147411407 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls") pod "dns-default-zbswg" (UID: "e555087a-130b-4ab0-aaa8-92c983ed7e0b") : secret "dns-default-metrics-tls" not found Apr 17 08:03:13.949521 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.949470 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdc7c\" (UniqueName: \"kubernetes.io/projected/e555087a-130b-4ab0-aaa8-92c983ed7e0b-kube-api-access-gdc7c\") pod \"dns-default-zbswg\" (UID: \"e555087a-130b-4ab0-aaa8-92c983ed7e0b\") " pod="openshift-dns/dns-default-zbswg" Apr 17 08:03:13.949521 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.949513 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert\") pod \"ingress-canary-gj4s7\" (UID: \"e1d27989-b735-4d7f-b801-7b81443d7ba7\") " pod="openshift-ingress-canary/ingress-canary-gj4s7" Apr 17 08:03:13.949709 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.949534 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e555087a-130b-4ab0-aaa8-92c983ed7e0b-tmp-dir\") pod \"dns-default-zbswg\" (UID: \"e555087a-130b-4ab0-aaa8-92c983ed7e0b\") " pod="openshift-dns/dns-default-zbswg" Apr 17 08:03:13.949709 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:13.949637 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 08:03:13.949709 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.949665 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e555087a-130b-4ab0-aaa8-92c983ed7e0b-config-volume\") pod \"dns-default-zbswg\" (UID: \"e555087a-130b-4ab0-aaa8-92c983ed7e0b\") " 
pod="openshift-dns/dns-default-zbswg"
Apr 17 08:03:13.949709 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:13.949697 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert podName:e1d27989-b735-4d7f-b801-7b81443d7ba7 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:14.449682371 +0000 UTC m=+37.147639018 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert") pod "ingress-canary-gj4s7" (UID: "e1d27989-b735-4d7f-b801-7b81443d7ba7") : secret "canary-serving-cert" not found
Apr 17 08:03:13.949838 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.949812 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e555087a-130b-4ab0-aaa8-92c983ed7e0b-tmp-dir\") pod \"dns-default-zbswg\" (UID: \"e555087a-130b-4ab0-aaa8-92c983ed7e0b\") " pod="openshift-dns/dns-default-zbswg"
Apr 17 08:03:13.950104 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.950087 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e555087a-130b-4ab0-aaa8-92c983ed7e0b-config-volume\") pod \"dns-default-zbswg\" (UID: \"e555087a-130b-4ab0-aaa8-92c983ed7e0b\") " pod="openshift-dns/dns-default-zbswg"
Apr 17 08:03:13.958621 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.958593 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdc7c\" (UniqueName: \"kubernetes.io/projected/e555087a-130b-4ab0-aaa8-92c983ed7e0b-kube-api-access-gdc7c\") pod \"dns-default-zbswg\" (UID: \"e555087a-130b-4ab0-aaa8-92c983ed7e0b\") " pod="openshift-dns/dns-default-zbswg"
Apr 17 08:03:13.958742 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:13.958723 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t75z4\" (UniqueName: \"kubernetes.io/projected/e1d27989-b735-4d7f-b801-7b81443d7ba7-kube-api-access-t75z4\") pod \"ingress-canary-gj4s7\" (UID: \"e1d27989-b735-4d7f-b801-7b81443d7ba7\") " pod="openshift-ingress-canary/ingress-canary-gj4s7"
Apr 17 08:03:14.022638 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:14.022602 2580 generic.go:358] "Generic (PLEG): container finished" podID="18d4abe2-95b8-4158-acde-3d01b4526f60" containerID="21ab5c2b6976a24c08fea8e4c9d27467643b12475c61895a033e60b992d2b98b" exitCode=0
Apr 17 08:03:14.022767 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:14.022647 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7p6p" event={"ID":"18d4abe2-95b8-4158-acde-3d01b4526f60","Type":"ContainerDied","Data":"21ab5c2b6976a24c08fea8e4c9d27467643b12475c61895a033e60b992d2b98b"}
Apr 17 08:03:14.454065 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:14.454030 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert\") pod \"ingress-canary-gj4s7\" (UID: \"e1d27989-b735-4d7f-b801-7b81443d7ba7\") " pod="openshift-ingress-canary/ingress-canary-gj4s7"
Apr 17 08:03:14.454213 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:14.454098 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls\") pod \"dns-default-zbswg\" (UID: \"e555087a-130b-4ab0-aaa8-92c983ed7e0b\") " pod="openshift-dns/dns-default-zbswg"
Apr 17 08:03:14.454213 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:14.454183 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 08:03:14.454213 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:14.454183 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 08:03:14.454333 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:14.454246 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls podName:e555087a-130b-4ab0-aaa8-92c983ed7e0b nodeName:}" failed. No retries permitted until 2026-04-17 08:03:15.454232843 +0000 UTC m=+38.152189486 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls") pod "dns-default-zbswg" (UID: "e555087a-130b-4ab0-aaa8-92c983ed7e0b") : secret "dns-default-metrics-tls" not found
Apr 17 08:03:14.454333 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:14.454260 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert podName:e1d27989-b735-4d7f-b801-7b81443d7ba7 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:15.454254589 +0000 UTC m=+38.152211233 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert") pod "ingress-canary-gj4s7" (UID: "e1d27989-b735-4d7f-b801-7b81443d7ba7") : secret "canary-serving-cert" not found
Apr 17 08:03:14.865956 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:14.865908 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5"
Apr 17 08:03:14.868061 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:14.868039 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 08:03:14.868173 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:14.868068 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x2n8l\""
Apr 17 08:03:15.026976 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:15.026944 2580 generic.go:358] "Generic (PLEG): container finished" podID="18d4abe2-95b8-4158-acde-3d01b4526f60" containerID="a2a9ab244b097c03be2deced374bc7a4c49aedd8ff62d713bfd51bdd62452f68" exitCode=0
Apr 17 08:03:15.027139 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:15.026996 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7p6p" event={"ID":"18d4abe2-95b8-4158-acde-3d01b4526f60","Type":"ContainerDied","Data":"a2a9ab244b097c03be2deced374bc7a4c49aedd8ff62d713bfd51bdd62452f68"}
Apr 17 08:03:15.461990 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:15.461954 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls\") pod \"dns-default-zbswg\" (UID: \"e555087a-130b-4ab0-aaa8-92c983ed7e0b\") " pod="openshift-dns/dns-default-zbswg"
Apr 17 08:03:15.462158 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:15.462017 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert\") pod \"ingress-canary-gj4s7\" (UID: \"e1d27989-b735-4d7f-b801-7b81443d7ba7\") " pod="openshift-ingress-canary/ingress-canary-gj4s7"
Apr 17 08:03:15.462158 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:15.462099 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 08:03:15.462158 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:15.462104 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 08:03:15.462158 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:15.462157 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert podName:e1d27989-b735-4d7f-b801-7b81443d7ba7 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:17.462140603 +0000 UTC m=+40.160097250 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert") pod "ingress-canary-gj4s7" (UID: "e1d27989-b735-4d7f-b801-7b81443d7ba7") : secret "canary-serving-cert" not found
Apr 17 08:03:15.462317 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:15.462172 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls podName:e555087a-130b-4ab0-aaa8-92c983ed7e0b nodeName:}" failed. No retries permitted until 2026-04-17 08:03:17.462165871 +0000 UTC m=+40.160122514 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls") pod "dns-default-zbswg" (UID: "e555087a-130b-4ab0-aaa8-92c983ed7e0b") : secret "dns-default-metrics-tls" not found
Apr 17 08:03:16.032188 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:16.032154 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7p6p" event={"ID":"18d4abe2-95b8-4158-acde-3d01b4526f60","Type":"ContainerStarted","Data":"e071a6aa897c3a294e068bd841d87b67d78e9dea0e0441f3dd285952e9583d75"}
Apr 17 08:03:16.052640 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:16.052494 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-q7p6p" podStartSLOduration=6.416623894 podStartE2EDuration="39.052474572s" podCreationTimestamp="2026-04-17 08:02:37 +0000 UTC" firstStartedPulling="2026-04-17 08:02:40.743717899 +0000 UTC m=+3.441674556" lastFinishedPulling="2026-04-17 08:03:13.37956859 +0000 UTC m=+36.077525234" observedRunningTime="2026-04-17 08:03:16.051556304 +0000 UTC m=+38.749512971" watchObservedRunningTime="2026-04-17 08:03:16.052474572 +0000 UTC m=+38.750431240"
Apr 17 08:03:17.476463 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:17.476427 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls\") pod \"dns-default-zbswg\" (UID: \"e555087a-130b-4ab0-aaa8-92c983ed7e0b\") " pod="openshift-dns/dns-default-zbswg"
Apr 17 08:03:17.476893 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:17.476482 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert\") pod \"ingress-canary-gj4s7\" (UID: \"e1d27989-b735-4d7f-b801-7b81443d7ba7\") " pod="openshift-ingress-canary/ingress-canary-gj4s7"
Apr 17 08:03:17.476893 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:17.476587 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 08:03:17.476893 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:17.476612 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 08:03:17.476893 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:17.476637 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert podName:e1d27989-b735-4d7f-b801-7b81443d7ba7 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:21.476623536 +0000 UTC m=+44.174580179 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert") pod "ingress-canary-gj4s7" (UID: "e1d27989-b735-4d7f-b801-7b81443d7ba7") : secret "canary-serving-cert" not found
Apr 17 08:03:17.476893 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:17.476676 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls podName:e555087a-130b-4ab0-aaa8-92c983ed7e0b nodeName:}" failed. No retries permitted until 2026-04-17 08:03:21.476658173 +0000 UTC m=+44.174614820 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls") pod "dns-default-zbswg" (UID: "e555087a-130b-4ab0-aaa8-92c983ed7e0b") : secret "dns-default-metrics-tls" not found
Apr 17 08:03:21.507017 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:21.506978 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert\") pod \"ingress-canary-gj4s7\" (UID: \"e1d27989-b735-4d7f-b801-7b81443d7ba7\") " pod="openshift-ingress-canary/ingress-canary-gj4s7"
Apr 17 08:03:21.507462 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:21.507049 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls\") pod \"dns-default-zbswg\" (UID: \"e555087a-130b-4ab0-aaa8-92c983ed7e0b\") " pod="openshift-dns/dns-default-zbswg"
Apr 17 08:03:21.507462 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:21.507141 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 08:03:21.507462 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:21.507203 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert podName:e1d27989-b735-4d7f-b801-7b81443d7ba7 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:29.50718589 +0000 UTC m=+52.205142538 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert") pod "ingress-canary-gj4s7" (UID: "e1d27989-b735-4d7f-b801-7b81443d7ba7") : secret "canary-serving-cert" not found
Apr 17 08:03:21.507462 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:21.507141 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 08:03:21.507462 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:21.507277 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls podName:e555087a-130b-4ab0-aaa8-92c983ed7e0b nodeName:}" failed. No retries permitted until 2026-04-17 08:03:29.507264304 +0000 UTC m=+52.205220948 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls") pod "dns-default-zbswg" (UID: "e555087a-130b-4ab0-aaa8-92c983ed7e0b") : secret "dns-default-metrics-tls" not found
Apr 17 08:03:29.564926 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:29.564704 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls\") pod \"dns-default-zbswg\" (UID: \"e555087a-130b-4ab0-aaa8-92c983ed7e0b\") " pod="openshift-dns/dns-default-zbswg"
Apr 17 08:03:29.565468 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:29.564954 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert\") pod \"ingress-canary-gj4s7\" (UID: \"e1d27989-b735-4d7f-b801-7b81443d7ba7\") " pod="openshift-ingress-canary/ingress-canary-gj4s7"
Apr 17 08:03:29.565468 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:29.564854 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 08:03:29.565468 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:29.565045 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls podName:e555087a-130b-4ab0-aaa8-92c983ed7e0b nodeName:}" failed. No retries permitted until 2026-04-17 08:03:45.565019204 +0000 UTC m=+68.262975856 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls") pod "dns-default-zbswg" (UID: "e555087a-130b-4ab0-aaa8-92c983ed7e0b") : secret "dns-default-metrics-tls" not found
Apr 17 08:03:29.565468 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:29.565058 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 08:03:29.565468 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:29.565096 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert podName:e1d27989-b735-4d7f-b801-7b81443d7ba7 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:45.565085244 +0000 UTC m=+68.263041887 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert") pod "ingress-canary-gj4s7" (UID: "e1d27989-b735-4d7f-b801-7b81443d7ba7") : secret "canary-serving-cert" not found
Apr 17 08:03:39.021017 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:39.020984 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wwcgr"
Apr 17 08:03:43.569095 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:43.569054 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs\") pod \"network-metrics-daemon-r9td5\" (UID: \"5c94e060-29ca-49bd-9d62-210b4628adef\") " pod="openshift-multus/network-metrics-daemon-r9td5"
Apr 17 08:03:43.571127 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:43.571106 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 08:03:43.579785 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:43.579756 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 08:03:43.579855 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:43.579845 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs podName:5c94e060-29ca-49bd-9d62-210b4628adef nodeName:}" failed. No retries permitted until 2026-04-17 08:04:47.579825199 +0000 UTC m=+130.277781843 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs") pod "network-metrics-daemon-r9td5" (UID: "5c94e060-29ca-49bd-9d62-210b4628adef") : secret "metrics-daemon-secret" not found
Apr 17 08:03:43.770602 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:43.770555 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9l49n\" (UniqueName: \"kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n\") pod \"network-check-target-bhnbj\" (UID: \"2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2\") " pod="openshift-network-diagnostics/network-check-target-bhnbj"
Apr 17 08:03:43.773000 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:43.772978 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 08:03:43.783286 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:43.783261 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 08:03:43.794893 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:43.794854 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l49n\" (UniqueName: \"kubernetes.io/projected/2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2-kube-api-access-9l49n\") pod \"network-check-target-bhnbj\" (UID: \"2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2\") " pod="openshift-network-diagnostics/network-check-target-bhnbj"
Apr 17 08:03:43.883539 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:43.883459 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nfrwk\""
Apr 17 08:03:43.891973 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:43.891942 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bhnbj"
Apr 17 08:03:43.972195 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:43.972155 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret\") pod \"global-pull-secret-syncer-6zncl\" (UID: \"28077202-06dd-4ed2-862a-f70c6f35f820\") " pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:03:43.974446 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:43.974351 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 08:03:43.985113 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:43.985080 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28077202-06dd-4ed2-862a-f70c6f35f820-original-pull-secret\") pod \"global-pull-secret-syncer-6zncl\" (UID: \"28077202-06dd-4ed2-862a-f70c6f35f820\") " pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:03:44.053375 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:44.053332 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bhnbj"]
Apr 17 08:03:44.059664 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:03:44.058875 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e9bd659_98fd_4a44_bf4a_2d4b8eb7ffb2.slice/crio-d204441a8aeb9bc9e31e2a3664c372978da08a46e13b63268c2248bbe2ebf67e WatchSource:0}: Error finding container d204441a8aeb9bc9e31e2a3664c372978da08a46e13b63268c2248bbe2ebf67e: Status 404 returned error can't find the container with id d204441a8aeb9bc9e31e2a3664c372978da08a46e13b63268c2248bbe2ebf67e
Apr 17 08:03:44.087344 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:44.087309 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bhnbj" event={"ID":"2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2","Type":"ContainerStarted","Data":"d204441a8aeb9bc9e31e2a3664c372978da08a46e13b63268c2248bbe2ebf67e"}
Apr 17 08:03:44.175872 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:44.175835 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zncl"
Apr 17 08:03:44.313802 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:44.313764 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6zncl"]
Apr 17 08:03:44.318342 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:03:44.318313 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28077202_06dd_4ed2_862a_f70c6f35f820.slice/crio-673fabc09d889767b5850c6e8454fa4c5b15a73f533c60713cb5c1409922e485 WatchSource:0}: Error finding container 673fabc09d889767b5850c6e8454fa4c5b15a73f533c60713cb5c1409922e485: Status 404 returned error can't find the container with id 673fabc09d889767b5850c6e8454fa4c5b15a73f533c60713cb5c1409922e485
Apr 17 08:03:45.090846 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:45.090798 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6zncl" event={"ID":"28077202-06dd-4ed2-862a-f70c6f35f820","Type":"ContainerStarted","Data":"673fabc09d889767b5850c6e8454fa4c5b15a73f533c60713cb5c1409922e485"}
Apr 17 08:03:45.584685 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:45.584640 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls\") pod \"dns-default-zbswg\" (UID: \"e555087a-130b-4ab0-aaa8-92c983ed7e0b\") " pod="openshift-dns/dns-default-zbswg"
Apr 17 08:03:45.584880 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:45.584727 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert\") pod \"ingress-canary-gj4s7\" (UID: \"e1d27989-b735-4d7f-b801-7b81443d7ba7\") " pod="openshift-ingress-canary/ingress-canary-gj4s7"
Apr 17 08:03:45.584880 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:45.584833 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 08:03:45.584880 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:45.584845 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 08:03:45.585078 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:45.584930 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls podName:e555087a-130b-4ab0-aaa8-92c983ed7e0b nodeName:}" failed. No retries permitted until 2026-04-17 08:04:17.584889144 +0000 UTC m=+100.282845789 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls") pod "dns-default-zbswg" (UID: "e555087a-130b-4ab0-aaa8-92c983ed7e0b") : secret "dns-default-metrics-tls" not found
Apr 17 08:03:45.585078 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:03:45.584952 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert podName:e1d27989-b735-4d7f-b801-7b81443d7ba7 nodeName:}" failed. No retries permitted until 2026-04-17 08:04:17.58494096 +0000 UTC m=+100.282897610 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert") pod "ingress-canary-gj4s7" (UID: "e1d27989-b735-4d7f-b801-7b81443d7ba7") : secret "canary-serving-cert" not found
Apr 17 08:03:49.099833 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:49.099737 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6zncl" event={"ID":"28077202-06dd-4ed2-862a-f70c6f35f820","Type":"ContainerStarted","Data":"37bfecc292c20a51b4eaf3d11b8ef1b1402da95395c0bb2e0bd4723a0d637595"}
Apr 17 08:03:49.101163 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:49.101114 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bhnbj" event={"ID":"2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2","Type":"ContainerStarted","Data":"f14a2137e7889c1e27d1007c1fe14e1af5520c9e5f95e61d4b8d405ea1a6c607"}
Apr 17 08:03:49.101277 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:49.101256 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bhnbj"
Apr 17 08:03:49.112676 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:49.112627 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6zncl" podStartSLOduration=64.5770837 podStartE2EDuration="1m9.112613591s" podCreationTimestamp="2026-04-17 08:02:40 +0000 UTC" firstStartedPulling="2026-04-17 08:03:44.319906354 +0000 UTC m=+67.017862997" lastFinishedPulling="2026-04-17 08:03:48.855436228 +0000 UTC m=+71.553392888" observedRunningTime="2026-04-17 08:03:49.112292531 +0000 UTC m=+71.810249199" watchObservedRunningTime="2026-04-17 08:03:49.112613591 +0000 UTC m=+71.810570257"
Apr 17 08:03:49.124341 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:49.124276 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bhnbj" podStartSLOduration=66.334083265 podStartE2EDuration="1m11.124261947s" podCreationTimestamp="2026-04-17 08:02:38 +0000 UTC" firstStartedPulling="2026-04-17 08:03:44.061453973 +0000 UTC m=+66.759410617" lastFinishedPulling="2026-04-17 08:03:48.851632653 +0000 UTC m=+71.549589299" observedRunningTime="2026-04-17 08:03:49.123810579 +0000 UTC m=+71.821767257" watchObservedRunningTime="2026-04-17 08:03:49.124261947 +0000 UTC m=+71.822218607"
Apr 17 08:03:53.128057 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.128025 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-577ff949cf-2k5w4"]
Apr 17 08:03:53.131206 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.131187 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-577ff949cf-2k5w4"
Apr 17 08:03:53.133287 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.133253 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 17 08:03:53.133287 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.133269 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 17 08:03:53.133287 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.133272 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 17 08:03:53.133787 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.133770 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 17 08:03:53.135540 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.135518 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z"]
Apr 17 08:03:53.138442 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.138425 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z"
Apr 17 08:03:53.141827 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.141804 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 17 08:03:53.141973 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.141846 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 17 08:03:53.141973 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.141887 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 17 08:03:53.142120 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.142097 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 17 08:03:53.142226 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.142117 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-577ff949cf-2k5w4"]
Apr 17 08:03:53.151449 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.151419 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z"]
Apr 17 08:03:53.241951 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.241890 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0a247d2c-af3b-4bac-a7c5-525dbdfa3803-tmp\") pod \"klusterlet-addon-workmgr-577ff949cf-2k5w4\" (UID: \"0a247d2c-af3b-4bac-a7c5-525dbdfa3803\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-577ff949cf-2k5w4"
Apr 17 08:03:53.241951 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.241954 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/7df88737-fc1e-43b4-af0a-7d661a73b431-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-657fbf4bf6-58b2z\" (UID: \"7df88737-fc1e-43b4-af0a-7d661a73b431\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z"
Apr 17 08:03:53.242206 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.241975 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smjhx\" (UniqueName: \"kubernetes.io/projected/7df88737-fc1e-43b4-af0a-7d661a73b431-kube-api-access-smjhx\") pod \"cluster-proxy-proxy-agent-657fbf4bf6-58b2z\" (UID: \"7df88737-fc1e-43b4-af0a-7d661a73b431\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z"
Apr 17 08:03:53.242206 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.242015 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7df88737-fc1e-43b4-af0a-7d661a73b431-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-657fbf4bf6-58b2z\" (UID: \"7df88737-fc1e-43b4-af0a-7d661a73b431\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z"
Apr 17 08:03:53.242206 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.242078 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/0a247d2c-af3b-4bac-a7c5-525dbdfa3803-klusterlet-config\") pod \"klusterlet-addon-workmgr-577ff949cf-2k5w4\" (UID: \"0a247d2c-af3b-4bac-a7c5-525dbdfa3803\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-577ff949cf-2k5w4"
Apr 17 08:03:53.242206 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.242115 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/7df88737-fc1e-43b4-af0a-7d661a73b431-ca\") pod \"cluster-proxy-proxy-agent-657fbf4bf6-58b2z\" (UID: \"7df88737-fc1e-43b4-af0a-7d661a73b431\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z"
Apr 17 08:03:53.242206 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.242137 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/7df88737-fc1e-43b4-af0a-7d661a73b431-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-657fbf4bf6-58b2z\" (UID: \"7df88737-fc1e-43b4-af0a-7d661a73b431\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z"
Apr 17 08:03:53.242206 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.242175 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmbh9\" (UniqueName: \"kubernetes.io/projected/0a247d2c-af3b-4bac-a7c5-525dbdfa3803-kube-api-access-zmbh9\") pod \"klusterlet-addon-workmgr-577ff949cf-2k5w4\" (UID: \"0a247d2c-af3b-4bac-a7c5-525dbdfa3803\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-577ff949cf-2k5w4"
Apr 17 08:03:53.242206 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.242191 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/7df88737-fc1e-43b4-af0a-7d661a73b431-hub\") pod \"cluster-proxy-proxy-agent-657fbf4bf6-58b2z\" (UID: \"7df88737-fc1e-43b4-af0a-7d661a73b431\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z"
Apr 17 08:03:53.342657 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.342607 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/7df88737-fc1e-43b4-af0a-7d661a73b431-ca\") pod \"cluster-proxy-proxy-agent-657fbf4bf6-58b2z\" (UID: \"7df88737-fc1e-43b4-af0a-7d661a73b431\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z"
Apr 17 08:03:53.342657 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.342663 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/7df88737-fc1e-43b4-af0a-7d661a73b431-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-657fbf4bf6-58b2z\" (UID: \"7df88737-fc1e-43b4-af0a-7d661a73b431\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z"
Apr 17 08:03:53.342900 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.342694 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmbh9\" (UniqueName: \"kubernetes.io/projected/0a247d2c-af3b-4bac-a7c5-525dbdfa3803-kube-api-access-zmbh9\") pod \"klusterlet-addon-workmgr-577ff949cf-2k5w4\" (UID: \"0a247d2c-af3b-4bac-a7c5-525dbdfa3803\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-577ff949cf-2k5w4"
Apr 17 08:03:53.342900 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.342709 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/7df88737-fc1e-43b4-af0a-7d661a73b431-hub\") pod \"cluster-proxy-proxy-agent-657fbf4bf6-58b2z\" (UID: \"7df88737-fc1e-43b4-af0a-7d661a73b431\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z"
Apr 17 08:03:53.342900 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.342736 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0a247d2c-af3b-4bac-a7c5-525dbdfa3803-tmp\") pod \"klusterlet-addon-workmgr-577ff949cf-2k5w4\" (UID: \"0a247d2c-af3b-4bac-a7c5-525dbdfa3803\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-577ff949cf-2k5w4"
Apr 17 08:03:53.342900 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.342752 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/7df88737-fc1e-43b4-af0a-7d661a73b431-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-657fbf4bf6-58b2z\" (UID: \"7df88737-fc1e-43b4-af0a-7d661a73b431\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z"
Apr 17 08:03:53.342900 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.342772 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smjhx\" (UniqueName: \"kubernetes.io/projected/7df88737-fc1e-43b4-af0a-7d661a73b431-kube-api-access-smjhx\") pod \"cluster-proxy-proxy-agent-657fbf4bf6-58b2z\" (UID: \"7df88737-fc1e-43b4-af0a-7d661a73b431\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z"
Apr 17 08:03:53.343170 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.343003 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7df88737-fc1e-43b4-af0a-7d661a73b431-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-657fbf4bf6-58b2z\" (UID: \"7df88737-fc1e-43b4-af0a-7d661a73b431\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z"
Apr 17 08:03:53.343170 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.343071
2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/0a247d2c-af3b-4bac-a7c5-525dbdfa3803-klusterlet-config\") pod \"klusterlet-addon-workmgr-577ff949cf-2k5w4\" (UID: \"0a247d2c-af3b-4bac-a7c5-525dbdfa3803\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-577ff949cf-2k5w4" Apr 17 08:03:53.343307 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.343275 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0a247d2c-af3b-4bac-a7c5-525dbdfa3803-tmp\") pod \"klusterlet-addon-workmgr-577ff949cf-2k5w4\" (UID: \"0a247d2c-af3b-4bac-a7c5-525dbdfa3803\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-577ff949cf-2k5w4" Apr 17 08:03:53.343586 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.343566 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/7df88737-fc1e-43b4-af0a-7d661a73b431-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-657fbf4bf6-58b2z\" (UID: \"7df88737-fc1e-43b4-af0a-7d661a73b431\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z" Apr 17 08:03:53.345416 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.345386 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/7df88737-fc1e-43b4-af0a-7d661a73b431-hub\") pod \"cluster-proxy-proxy-agent-657fbf4bf6-58b2z\" (UID: \"7df88737-fc1e-43b4-af0a-7d661a73b431\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z" Apr 17 08:03:53.345537 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.345418 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7df88737-fc1e-43b4-af0a-7d661a73b431-hub-kubeconfig\") pod 
\"cluster-proxy-proxy-agent-657fbf4bf6-58b2z\" (UID: \"7df88737-fc1e-43b4-af0a-7d661a73b431\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z" Apr 17 08:03:53.345636 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.345610 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/7df88737-fc1e-43b4-af0a-7d661a73b431-ca\") pod \"cluster-proxy-proxy-agent-657fbf4bf6-58b2z\" (UID: \"7df88737-fc1e-43b4-af0a-7d661a73b431\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z" Apr 17 08:03:53.345717 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.345698 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/7df88737-fc1e-43b4-af0a-7d661a73b431-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-657fbf4bf6-58b2z\" (UID: \"7df88737-fc1e-43b4-af0a-7d661a73b431\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z" Apr 17 08:03:53.346165 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.346150 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/0a247d2c-af3b-4bac-a7c5-525dbdfa3803-klusterlet-config\") pod \"klusterlet-addon-workmgr-577ff949cf-2k5w4\" (UID: \"0a247d2c-af3b-4bac-a7c5-525dbdfa3803\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-577ff949cf-2k5w4" Apr 17 08:03:53.349956 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.349933 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smjhx\" (UniqueName: \"kubernetes.io/projected/7df88737-fc1e-43b4-af0a-7d661a73b431-kube-api-access-smjhx\") pod \"cluster-proxy-proxy-agent-657fbf4bf6-58b2z\" (UID: \"7df88737-fc1e-43b4-af0a-7d661a73b431\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z" Apr 17 08:03:53.350054 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.349963 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmbh9\" (UniqueName: \"kubernetes.io/projected/0a247d2c-af3b-4bac-a7c5-525dbdfa3803-kube-api-access-zmbh9\") pod \"klusterlet-addon-workmgr-577ff949cf-2k5w4\" (UID: \"0a247d2c-af3b-4bac-a7c5-525dbdfa3803\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-577ff949cf-2k5w4" Apr 17 08:03:53.440787 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.440729 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-577ff949cf-2k5w4" Apr 17 08:03:53.459752 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.459721 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z" Apr 17 08:03:53.562378 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.562147 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-577ff949cf-2k5w4"] Apr 17 08:03:53.564885 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:03:53.564845 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a247d2c_af3b_4bac_a7c5_525dbdfa3803.slice/crio-29cde00bd35628b707ffa7164588d3a0b733fd3e1d17ce4861041c105af662e5 WatchSource:0}: Error finding container 29cde00bd35628b707ffa7164588d3a0b733fd3e1d17ce4861041c105af662e5: Status 404 returned error can't find the container with id 29cde00bd35628b707ffa7164588d3a0b733fd3e1d17ce4861041c105af662e5 Apr 17 08:03:53.589764 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:53.589727 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z"] Apr 17 08:03:53.593497 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:03:53.593466 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7df88737_fc1e_43b4_af0a_7d661a73b431.slice/crio-1f06f44272e4b2c4a52f4a68555105b9c954ab0a51c64e6105250ac5a43644e5 WatchSource:0}: Error finding container 1f06f44272e4b2c4a52f4a68555105b9c954ab0a51c64e6105250ac5a43644e5: Status 404 returned error can't find the container with id 1f06f44272e4b2c4a52f4a68555105b9c954ab0a51c64e6105250ac5a43644e5 Apr 17 08:03:54.111140 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:54.111098 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z" event={"ID":"7df88737-fc1e-43b4-af0a-7d661a73b431","Type":"ContainerStarted","Data":"1f06f44272e4b2c4a52f4a68555105b9c954ab0a51c64e6105250ac5a43644e5"} Apr 17 08:03:54.111972 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:54.111948 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-577ff949cf-2k5w4" event={"ID":"0a247d2c-af3b-4bac-a7c5-525dbdfa3803","Type":"ContainerStarted","Data":"29cde00bd35628b707ffa7164588d3a0b733fd3e1d17ce4861041c105af662e5"} Apr 17 08:03:58.122314 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:58.122223 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z" event={"ID":"7df88737-fc1e-43b4-af0a-7d661a73b431","Type":"ContainerStarted","Data":"ff609b18b315878bc3987ebb26836a9fe267232d3af17a49d6ac822ce5681934"} Apr 17 08:03:58.123487 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:58.123462 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-577ff949cf-2k5w4" 
event={"ID":"0a247d2c-af3b-4bac-a7c5-525dbdfa3803","Type":"ContainerStarted","Data":"c05997a23b998917f7294f9eb993e467628438d08838af8048da0ae0b286ec1b"} Apr 17 08:03:58.123680 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:58.123660 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-577ff949cf-2k5w4" Apr 17 08:03:58.125278 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:58.125257 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-577ff949cf-2k5w4" Apr 17 08:03:58.138955 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:03:58.138900 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-577ff949cf-2k5w4" podStartSLOduration=0.920199831 podStartE2EDuration="5.138887086s" podCreationTimestamp="2026-04-17 08:03:53 +0000 UTC" firstStartedPulling="2026-04-17 08:03:53.566638444 +0000 UTC m=+76.264595088" lastFinishedPulling="2026-04-17 08:03:57.785325695 +0000 UTC m=+80.483282343" observedRunningTime="2026-04-17 08:03:58.137496665 +0000 UTC m=+80.835453344" watchObservedRunningTime="2026-04-17 08:03:58.138887086 +0000 UTC m=+80.836843751" Apr 17 08:04:00.132506 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:00.132471 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z" event={"ID":"7df88737-fc1e-43b4-af0a-7d661a73b431","Type":"ContainerStarted","Data":"6a1413b741fb3dae05ef6e38ce9e76b9e6516677658684966ecb0e36a3a01b8a"} Apr 17 08:04:00.132506 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:00.132506 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z" 
event={"ID":"7df88737-fc1e-43b4-af0a-7d661a73b431","Type":"ContainerStarted","Data":"4b28c9656848ed038feb7313b8101a4c8066ec17bb92b7b11a7e9d8ab8f380e4"} Apr 17 08:04:00.160972 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:00.160908 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z" podStartSLOduration=1.137592785 podStartE2EDuration="7.160889998s" podCreationTimestamp="2026-04-17 08:03:53 +0000 UTC" firstStartedPulling="2026-04-17 08:03:53.59524294 +0000 UTC m=+76.293199584" lastFinishedPulling="2026-04-17 08:03:59.618540152 +0000 UTC m=+82.316496797" observedRunningTime="2026-04-17 08:04:00.149440995 +0000 UTC m=+82.847397663" watchObservedRunningTime="2026-04-17 08:04:00.160889998 +0000 UTC m=+82.858846696" Apr 17 08:04:17.627413 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:17.627309 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert\") pod \"ingress-canary-gj4s7\" (UID: \"e1d27989-b735-4d7f-b801-7b81443d7ba7\") " pod="openshift-ingress-canary/ingress-canary-gj4s7" Apr 17 08:04:17.627413 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:17.627394 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls\") pod \"dns-default-zbswg\" (UID: \"e555087a-130b-4ab0-aaa8-92c983ed7e0b\") " pod="openshift-dns/dns-default-zbswg" Apr 17 08:04:17.627889 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:04:17.627464 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 08:04:17.627889 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:04:17.627511 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not 
found Apr 17 08:04:17.627889 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:04:17.627535 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert podName:e1d27989-b735-4d7f-b801-7b81443d7ba7 nodeName:}" failed. No retries permitted until 2026-04-17 08:05:21.62751932 +0000 UTC m=+164.325475964 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert") pod "ingress-canary-gj4s7" (UID: "e1d27989-b735-4d7f-b801-7b81443d7ba7") : secret "canary-serving-cert" not found Apr 17 08:04:17.627889 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:04:17.627580 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls podName:e555087a-130b-4ab0-aaa8-92c983ed7e0b nodeName:}" failed. No retries permitted until 2026-04-17 08:05:21.627561167 +0000 UTC m=+164.325517831 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls") pod "dns-default-zbswg" (UID: "e555087a-130b-4ab0-aaa8-92c983ed7e0b") : secret "dns-default-metrics-tls" not found Apr 17 08:04:20.105157 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:20.105120 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bhnbj" Apr 17 08:04:26.865756 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:26.865730 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sv4zj_62832cab-fee8-498a-b9dd-d410e9f3e921/dns-node-resolver/0.log" Apr 17 08:04:27.265580 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:27.265551 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-lf6p6_d06f7085-abd8-4770-bc98-8794c5f1a056/node-ca/0.log" Apr 17 08:04:47.652185 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:47.652131 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs\") pod \"network-metrics-daemon-r9td5\" (UID: \"5c94e060-29ca-49bd-9d62-210b4628adef\") " pod="openshift-multus/network-metrics-daemon-r9td5" Apr 17 08:04:47.652680 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:04:47.652280 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 08:04:47.652680 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:04:47.652357 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs podName:5c94e060-29ca-49bd-9d62-210b4628adef nodeName:}" failed. No retries permitted until 2026-04-17 08:06:49.652340766 +0000 UTC m=+252.350297409 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs") pod "network-metrics-daemon-r9td5" (UID: "5c94e060-29ca-49bd-9d62-210b4628adef") : secret "metrics-daemon-secret" not found Apr 17 08:04:58.711717 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.711682 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-td59z"] Apr 17 08:04:58.714729 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.714711 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-td59z" Apr 17 08:04:58.717075 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.717049 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 08:04:58.717247 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.717228 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 08:04:58.717806 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.717790 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5w4p6\"" Apr 17 08:04:58.717869 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.717810 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 08:04:58.717869 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.717843 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 08:04:58.731475 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.731445 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-td59z"] Apr 17 08:04:58.733202 ip-10-0-128-245 
kubenswrapper[2580]: I0417 08:04:58.733170 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/794bfcee-a6d5-46eb-81ca-623c6c1871af-crio-socket\") pod \"insights-runtime-extractor-td59z\" (UID: \"794bfcee-a6d5-46eb-81ca-623c6c1871af\") " pod="openshift-insights/insights-runtime-extractor-td59z" Apr 17 08:04:58.733389 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.733214 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/794bfcee-a6d5-46eb-81ca-623c6c1871af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-td59z\" (UID: \"794bfcee-a6d5-46eb-81ca-623c6c1871af\") " pod="openshift-insights/insights-runtime-extractor-td59z" Apr 17 08:04:58.733389 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.733274 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzwjw\" (UniqueName: \"kubernetes.io/projected/794bfcee-a6d5-46eb-81ca-623c6c1871af-kube-api-access-tzwjw\") pod \"insights-runtime-extractor-td59z\" (UID: \"794bfcee-a6d5-46eb-81ca-623c6c1871af\") " pod="openshift-insights/insights-runtime-extractor-td59z" Apr 17 08:04:58.733389 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.733352 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/794bfcee-a6d5-46eb-81ca-623c6c1871af-data-volume\") pod \"insights-runtime-extractor-td59z\" (UID: \"794bfcee-a6d5-46eb-81ca-623c6c1871af\") " pod="openshift-insights/insights-runtime-extractor-td59z" Apr 17 08:04:58.733510 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.733398 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/794bfcee-a6d5-46eb-81ca-623c6c1871af-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-td59z\" (UID: \"794bfcee-a6d5-46eb-81ca-623c6c1871af\") " pod="openshift-insights/insights-runtime-extractor-td59z" Apr 17 08:04:58.758492 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.758456 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8c884d54b-ths49"] Apr 17 08:04:58.761230 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.761213 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8c884d54b-ths49" Apr 17 08:04:58.763203 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.763177 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 08:04:58.763325 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.763250 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5m5pl\"" Apr 17 08:04:58.763325 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.763315 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 08:04:58.763435 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.763251 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 08:04:58.769150 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.769124 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 08:04:58.775259 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.775234 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8c884d54b-ths49"] Apr 17 08:04:58.834142 ip-10-0-128-245 
kubenswrapper[2580]: I0417 08:04:58.834097 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-installation-pull-secrets\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49" Apr 17 08:04:58.834142 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.834149 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-ca-trust-extracted\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49" Apr 17 08:04:58.834375 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.834166 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-trusted-ca\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49" Apr 17 08:04:58.834375 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.834183 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-bound-sa-token\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49" Apr 17 08:04:58.834375 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.834215 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/794bfcee-a6d5-46eb-81ca-623c6c1871af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-td59z\" (UID: \"794bfcee-a6d5-46eb-81ca-623c6c1871af\") " pod="openshift-insights/insights-runtime-extractor-td59z" Apr 17 08:04:58.834375 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.834233 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-image-registry-private-configuration\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49" Apr 17 08:04:58.834375 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.834265 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzwjw\" (UniqueName: \"kubernetes.io/projected/794bfcee-a6d5-46eb-81ca-623c6c1871af-kube-api-access-tzwjw\") pod \"insights-runtime-extractor-td59z\" (UID: \"794bfcee-a6d5-46eb-81ca-623c6c1871af\") " pod="openshift-insights/insights-runtime-extractor-td59z" Apr 17 08:04:58.834375 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.834292 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/794bfcee-a6d5-46eb-81ca-623c6c1871af-crio-socket\") pod \"insights-runtime-extractor-td59z\" (UID: \"794bfcee-a6d5-46eb-81ca-623c6c1871af\") " pod="openshift-insights/insights-runtime-extractor-td59z" Apr 17 08:04:58.834375 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.834326 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/794bfcee-a6d5-46eb-81ca-623c6c1871af-data-volume\") pod \"insights-runtime-extractor-td59z\" (UID: \"794bfcee-a6d5-46eb-81ca-623c6c1871af\") " 
pod="openshift-insights/insights-runtime-extractor-td59z"
Apr 17 08:04:58.834375 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.834353 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-registry-tls\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:04:58.834716 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.834382 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/794bfcee-a6d5-46eb-81ca-623c6c1871af-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-td59z\" (UID: \"794bfcee-a6d5-46eb-81ca-623c6c1871af\") " pod="openshift-insights/insights-runtime-extractor-td59z"
Apr 17 08:04:58.834716 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.834405 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/794bfcee-a6d5-46eb-81ca-623c6c1871af-crio-socket\") pod \"insights-runtime-extractor-td59z\" (UID: \"794bfcee-a6d5-46eb-81ca-623c6c1871af\") " pod="openshift-insights/insights-runtime-extractor-td59z"
Apr 17 08:04:58.834716 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.834449 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmpl2\" (UniqueName: \"kubernetes.io/projected/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-kube-api-access-wmpl2\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:04:58.834716 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.834474 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-registry-certificates\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:04:58.835270 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.835249 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/794bfcee-a6d5-46eb-81ca-623c6c1871af-data-volume\") pod \"insights-runtime-extractor-td59z\" (UID: \"794bfcee-a6d5-46eb-81ca-623c6c1871af\") " pod="openshift-insights/insights-runtime-extractor-td59z"
Apr 17 08:04:58.835467 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.835450 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/794bfcee-a6d5-46eb-81ca-623c6c1871af-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-td59z\" (UID: \"794bfcee-a6d5-46eb-81ca-623c6c1871af\") " pod="openshift-insights/insights-runtime-extractor-td59z"
Apr 17 08:04:58.837030 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.837014 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/794bfcee-a6d5-46eb-81ca-623c6c1871af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-td59z\" (UID: \"794bfcee-a6d5-46eb-81ca-623c6c1871af\") " pod="openshift-insights/insights-runtime-extractor-td59z"
Apr 17 08:04:58.842227 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.842200 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzwjw\" (UniqueName: \"kubernetes.io/projected/794bfcee-a6d5-46eb-81ca-623c6c1871af-kube-api-access-tzwjw\") pod \"insights-runtime-extractor-td59z\" (UID: \"794bfcee-a6d5-46eb-81ca-623c6c1871af\") " pod="openshift-insights/insights-runtime-extractor-td59z"
Apr 17 08:04:58.935174 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.935126 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-image-registry-private-configuration\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:04:58.935376 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.935207 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-registry-tls\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:04:58.935376 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.935252 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmpl2\" (UniqueName: \"kubernetes.io/projected/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-kube-api-access-wmpl2\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:04:58.935376 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.935275 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-registry-certificates\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:04:58.935376 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.935316 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-installation-pull-secrets\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:04:58.935376 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.935353 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-ca-trust-extracted\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:04:58.935376 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.935376 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-trusted-ca\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:04:58.935664 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.935400 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-bound-sa-token\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:04:58.936013 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.935988 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-ca-trust-extracted\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:04:58.936429 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.936402 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-registry-certificates\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:04:58.936789 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.936764 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-trusted-ca\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:04:58.937695 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.937671 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-installation-pull-secrets\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:04:58.937791 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.937766 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-image-registry-private-configuration\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:04:58.937952 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.937933 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-registry-tls\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:04:58.945281 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.945254 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-bound-sa-token\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:04:58.945491 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:58.945469 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmpl2\" (UniqueName: \"kubernetes.io/projected/cff45cdf-b293-4f29-9fc9-ac02d70f69dc-kube-api-access-wmpl2\") pod \"image-registry-8c884d54b-ths49\" (UID: \"cff45cdf-b293-4f29-9fc9-ac02d70f69dc\") " pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:04:59.024347 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:59.024252 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-td59z"
Apr 17 08:04:59.071076 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:59.070945 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:04:59.148044 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:59.148010 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-td59z"]
Apr 17 08:04:59.151120 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:04:59.151085 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod794bfcee_a6d5_46eb_81ca_623c6c1871af.slice/crio-1fb0ad64178ebfd8b73c6f2c8633b34699b3f28b4e2d3c7ab34455f741248800 WatchSource:0}: Error finding container 1fb0ad64178ebfd8b73c6f2c8633b34699b3f28b4e2d3c7ab34455f741248800: Status 404 returned error can't find the container with id 1fb0ad64178ebfd8b73c6f2c8633b34699b3f28b4e2d3c7ab34455f741248800
Apr 17 08:04:59.207084 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:59.207053 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8c884d54b-ths49"]
Apr 17 08:04:59.209671 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:04:59.209644 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcff45cdf_b293_4f29_9fc9_ac02d70f69dc.slice/crio-d27340f3436d4536ad138047534b0390267da9bb85af8cc63f9f7b496fc1cf48 WatchSource:0}: Error finding container d27340f3436d4536ad138047534b0390267da9bb85af8cc63f9f7b496fc1cf48: Status 404 returned error can't find the container with id d27340f3436d4536ad138047534b0390267da9bb85af8cc63f9f7b496fc1cf48
Apr 17 08:04:59.278209 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:59.278119 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8c884d54b-ths49" event={"ID":"cff45cdf-b293-4f29-9fc9-ac02d70f69dc","Type":"ContainerStarted","Data":"fb655d86a11c2f40299b1fd70cb9351c31136ce219ad60e2745be6040f95f7cd"}
Apr 17 08:04:59.278209 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:59.278161 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8c884d54b-ths49" event={"ID":"cff45cdf-b293-4f29-9fc9-ac02d70f69dc","Type":"ContainerStarted","Data":"d27340f3436d4536ad138047534b0390267da9bb85af8cc63f9f7b496fc1cf48"}
Apr 17 08:04:59.279320 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:59.279297 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-td59z" event={"ID":"794bfcee-a6d5-46eb-81ca-623c6c1871af","Type":"ContainerStarted","Data":"9ef122dc617912021d005df70f397c881e78a7b67e39cde72c62daef07067242"}
Apr 17 08:04:59.279434 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:04:59.279327 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-td59z" event={"ID":"794bfcee-a6d5-46eb-81ca-623c6c1871af","Type":"ContainerStarted","Data":"1fb0ad64178ebfd8b73c6f2c8633b34699b3f28b4e2d3c7ab34455f741248800"}
Apr 17 08:05:00.283835 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:00.283776 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-td59z" event={"ID":"794bfcee-a6d5-46eb-81ca-623c6c1871af","Type":"ContainerStarted","Data":"c7a4efa84832f7730ace318981aa921de7f11d4da471446986ea956f368da2c9"}
Apr 17 08:05:00.284313 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:00.283926 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-8c884d54b-ths49"
Apr 17 08:05:00.303528 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:00.303464 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-8c884d54b-ths49" podStartSLOduration=2.303441881 podStartE2EDuration="2.303441881s" podCreationTimestamp="2026-04-17 08:04:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:05:00.302400477 +0000 UTC m=+143.000357168" watchObservedRunningTime="2026-04-17 08:05:00.303441881 +0000 UTC m=+143.001398548"
Apr 17 08:05:02.290444 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:02.290403 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-td59z" event={"ID":"794bfcee-a6d5-46eb-81ca-623c6c1871af","Type":"ContainerStarted","Data":"3e425efaedfc6e8715e16d5ebd3392668ad4f57c3fbaf8befe89d9950a7f6338"}
Apr 17 08:05:02.306315 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:02.306259 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-td59z" podStartSLOduration=2.092561957 podStartE2EDuration="4.306243572s" podCreationTimestamp="2026-04-17 08:04:58 +0000 UTC" firstStartedPulling="2026-04-17 08:04:59.210812009 +0000 UTC m=+141.908768658" lastFinishedPulling="2026-04-17 08:05:01.424493629 +0000 UTC m=+144.122450273" observedRunningTime="2026-04-17 08:05:02.305134012 +0000 UTC m=+145.003090702" watchObservedRunningTime="2026-04-17 08:05:02.306243572 +0000 UTC m=+145.004200290"
Apr 17 08:05:07.288396 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.288358 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-b6brf"]
Apr 17 08:05:07.293208 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.293181 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.295392 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.295364 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 08:05:07.295392 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.295370 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 08:05:07.295985 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.295960 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 08:05:07.295985 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.295964 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 08:05:07.296156 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.295964 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 08:05:07.296156 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.296044 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-hwh7w\""
Apr 17 08:05:07.296156 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.296110 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 08:05:07.401594 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.401556 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmwxh\" (UniqueName: \"kubernetes.io/projected/ae5e4541-4bb5-425b-a511-1291960981fc-kube-api-access-wmwxh\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.401594 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.401599 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae5e4541-4bb5-425b-a511-1291960981fc-sys\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.401849 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.401625 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ae5e4541-4bb5-425b-a511-1291960981fc-node-exporter-accelerators-collector-config\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.401849 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.401685 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ae5e4541-4bb5-425b-a511-1291960981fc-node-exporter-wtmp\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.401849 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.401726 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae5e4541-4bb5-425b-a511-1291960981fc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.401849 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.401751 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae5e4541-4bb5-425b-a511-1291960981fc-metrics-client-ca\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.401849 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.401808 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ae5e4541-4bb5-425b-a511-1291960981fc-node-exporter-textfile\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.401849 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.401842 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ae5e4541-4bb5-425b-a511-1291960981fc-node-exporter-tls\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.402073 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.401884 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ae5e4541-4bb5-425b-a511-1291960981fc-root\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.502484 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.502449 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ae5e4541-4bb5-425b-a511-1291960981fc-node-exporter-wtmp\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.502640 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.502503 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae5e4541-4bb5-425b-a511-1291960981fc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.502640 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.502533 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae5e4541-4bb5-425b-a511-1291960981fc-metrics-client-ca\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.502640 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.502563 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ae5e4541-4bb5-425b-a511-1291960981fc-node-exporter-textfile\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.502640 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.502589 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ae5e4541-4bb5-425b-a511-1291960981fc-node-exporter-tls\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.502876 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.502654 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ae5e4541-4bb5-425b-a511-1291960981fc-node-exporter-wtmp\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.502876 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.502670 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ae5e4541-4bb5-425b-a511-1291960981fc-root\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.502876 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.502728 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ae5e4541-4bb5-425b-a511-1291960981fc-root\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.502876 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.502799 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmwxh\" (UniqueName: \"kubernetes.io/projected/ae5e4541-4bb5-425b-a511-1291960981fc-kube-api-access-wmwxh\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.502876 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.502828 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae5e4541-4bb5-425b-a511-1291960981fc-sys\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.502876 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.502855 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ae5e4541-4bb5-425b-a511-1291960981fc-node-exporter-accelerators-collector-config\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.503197 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.502938 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae5e4541-4bb5-425b-a511-1291960981fc-sys\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.503248 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.503225 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae5e4541-4bb5-425b-a511-1291960981fc-metrics-client-ca\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.503248 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.503233 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ae5e4541-4bb5-425b-a511-1291960981fc-node-exporter-textfile\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.503408 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.503387 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ae5e4541-4bb5-425b-a511-1291960981fc-node-exporter-accelerators-collector-config\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.504941 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.504907 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae5e4541-4bb5-425b-a511-1291960981fc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.505091 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.505072 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ae5e4541-4bb5-425b-a511-1291960981fc-node-exporter-tls\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.510066 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.510037 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmwxh\" (UniqueName: \"kubernetes.io/projected/ae5e4541-4bb5-425b-a511-1291960981fc-kube-api-access-wmwxh\") pod \"node-exporter-b6brf\" (UID: \"ae5e4541-4bb5-425b-a511-1291960981fc\") " pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.602449 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:07.602366 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-b6brf"
Apr 17 08:05:07.611140 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:05:07.611110 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae5e4541_4bb5_425b_a511_1291960981fc.slice/crio-4decf1bd1ffeb9f4a486fb14a9494167d31b8ab83d5696bbbb5fa020ad55f156 WatchSource:0}: Error finding container 4decf1bd1ffeb9f4a486fb14a9494167d31b8ab83d5696bbbb5fa020ad55f156: Status 404 returned error can't find the container with id 4decf1bd1ffeb9f4a486fb14a9494167d31b8ab83d5696bbbb5fa020ad55f156
Apr 17 08:05:08.305151 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:08.305114 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-b6brf" event={"ID":"ae5e4541-4bb5-425b-a511-1291960981fc","Type":"ContainerStarted","Data":"4decf1bd1ffeb9f4a486fb14a9494167d31b8ab83d5696bbbb5fa020ad55f156"}
Apr 17 08:05:09.309334 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:09.309295 2580 generic.go:358] "Generic (PLEG): container finished" podID="ae5e4541-4bb5-425b-a511-1291960981fc" containerID="d956681d0a11a4accbb77f2369257ff72edd4abbfb78a9662da8282d1e4ed1ed" exitCode=0
Apr 17 08:05:09.309827 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:09.309362 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-b6brf" event={"ID":"ae5e4541-4bb5-425b-a511-1291960981fc","Type":"ContainerDied","Data":"d956681d0a11a4accbb77f2369257ff72edd4abbfb78a9662da8282d1e4ed1ed"}
Apr 17 08:05:10.316807 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:10.316771 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-b6brf" event={"ID":"ae5e4541-4bb5-425b-a511-1291960981fc","Type":"ContainerStarted","Data":"f80e8b6262b5743fdd2090b6aeed7325e56f9f432278dd135528f4c72387d015"}
Apr 17 08:05:10.316807 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:10.316810 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-b6brf" event={"ID":"ae5e4541-4bb5-425b-a511-1291960981fc","Type":"ContainerStarted","Data":"ab51a1aea1a2b4b03d4e66ec5e8af808e747fffdb743989793cea30668cfc7ae"}
Apr 17 08:05:10.334208 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:10.334149 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-b6brf" podStartSLOduration=2.467875908 podStartE2EDuration="3.334133852s" podCreationTimestamp="2026-04-17 08:05:07 +0000 UTC" firstStartedPulling="2026-04-17 08:05:07.613194113 +0000 UTC m=+150.311150756" lastFinishedPulling="2026-04-17 08:05:08.479452051 +0000 UTC m=+151.177408700" observedRunningTime="2026-04-17 08:05:10.332588946 +0000 UTC m=+153.030545626" watchObservedRunningTime="2026-04-17 08:05:10.334133852 +0000 UTC m=+153.032090517"
Apr 17 08:05:16.684023 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:05:16.683966 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-zbswg" podUID="e555087a-130b-4ab0-aaa8-92c983ed7e0b"
Apr 17 08:05:16.689154 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:05:16.689124 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-gj4s7" podUID="e1d27989-b735-4d7f-b801-7b81443d7ba7"
Apr 17 08:05:17.331354 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:17.331332 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zbswg"
Apr 17 08:05:17.331487 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:17.331331 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gj4s7"
Apr 17 08:05:17.874961 ip-10-0-128-245 kubenswrapper[2580]: E0417 08:05:17.874885 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-r9td5" podUID="5c94e060-29ca-49bd-9d62-210b4628adef"
Apr 17 08:05:19.075402 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:19.075355 2580 patch_prober.go:28] interesting pod/image-registry-8c884d54b-ths49 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 08:05:19.075865 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:19.075430 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-8c884d54b-ths49" podUID="cff45cdf-b293-4f29-9fc9-ac02d70f69dc" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:05:21.291097 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:21.291063 2580 patch_prober.go:28] interesting pod/image-registry-8c884d54b-ths49 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 08:05:21.291483 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:21.291135 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-8c884d54b-ths49" podUID="cff45cdf-b293-4f29-9fc9-ac02d70f69dc" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:05:21.722760 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:21.722715 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls\") pod \"dns-default-zbswg\" (UID: \"e555087a-130b-4ab0-aaa8-92c983ed7e0b\") " pod="openshift-dns/dns-default-zbswg"
Apr 17 08:05:21.722960 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:21.722801 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert\") pod \"ingress-canary-gj4s7\" (UID: \"e1d27989-b735-4d7f-b801-7b81443d7ba7\") " pod="openshift-ingress-canary/ingress-canary-gj4s7"
Apr 17 08:05:21.725173 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:21.725143 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e555087a-130b-4ab0-aaa8-92c983ed7e0b-metrics-tls\") pod \"dns-default-zbswg\" (UID: \"e555087a-130b-4ab0-aaa8-92c983ed7e0b\") " pod="openshift-dns/dns-default-zbswg"
Apr 17 08:05:21.725316 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:21.725294 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1d27989-b735-4d7f-b801-7b81443d7ba7-cert\") pod \"ingress-canary-gj4s7\" (UID: \"e1d27989-b735-4d7f-b801-7b81443d7ba7\") " pod="openshift-ingress-canary/ingress-canary-gj4s7"
Apr 17 08:05:21.835096 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:21.835064 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wdhxs\""
Apr 17 08:05:21.835096 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:21.835065 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-x9bwm\""
Apr 17 08:05:21.842974 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:21.842943 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gj4s7"
Apr 17 08:05:21.842974 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:21.842967 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zbswg"
Apr 17 08:05:21.994926 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:21.994882 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zbswg"]
Apr 17 08:05:21.997818 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:05:21.997779 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode555087a_130b_4ab0_aaa8_92c983ed7e0b.slice/crio-f04e6419a1498a30e5fb9b3e3b8acdb1aeca041563d130d315f77678de7331bd WatchSource:0}: Error finding container f04e6419a1498a30e5fb9b3e3b8acdb1aeca041563d130d315f77678de7331bd: Status 404 returned error can't find the container with id f04e6419a1498a30e5fb9b3e3b8acdb1aeca041563d130d315f77678de7331bd
Apr 17 08:05:22.010939 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:22.010763 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gj4s7"]
Apr 17 08:05:22.013654 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:05:22.013612 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1d27989_b735_4d7f_b801_7b81443d7ba7.slice/crio-9acf29a334591d641d15e0820c3f3d5e634bc369840c492e35698075e22af10e WatchSource:0}: Error finding container 9acf29a334591d641d15e0820c3f3d5e634bc369840c492e35698075e22af10e: Status 404 returned error can't find the container with id 9acf29a334591d641d15e0820c3f3d5e634bc369840c492e35698075e22af10e
Apr 17 08:05:22.343639 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:22.343548 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zbswg" event={"ID":"e555087a-130b-4ab0-aaa8-92c983ed7e0b","Type":"ContainerStarted","Data":"f04e6419a1498a30e5fb9b3e3b8acdb1aeca041563d130d315f77678de7331bd"}
Apr 17 08:05:22.344542 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:22.344517 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gj4s7" event={"ID":"e1d27989-b735-4d7f-b801-7b81443d7ba7","Type":"ContainerStarted","Data":"9acf29a334591d641d15e0820c3f3d5e634bc369840c492e35698075e22af10e"}
Apr 17 08:05:23.461634 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:23.461577 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z" podUID="7df88737-fc1e-43b4-af0a-7d661a73b431" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 08:05:24.352121 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:24.352078 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gj4s7" event={"ID":"e1d27989-b735-4d7f-b801-7b81443d7ba7","Type":"ContainerStarted","Data":"6479c03c5edaba9326d91d11e8fa00c5a11b02ab885c39d4e2fb05da93b3e996"}
Apr 17 08:05:24.353703 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:24.353677 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zbswg" event={"ID":"e555087a-130b-4ab0-aaa8-92c983ed7e0b","Type":"ContainerStarted","Data":"8201da067105693ca77a7494e51b398513146446b936d4c12c9cd62142d395d0"}
Apr 17 08:05:24.353703 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:24.353705 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zbswg" event={"ID":"e555087a-130b-4ab0-aaa8-92c983ed7e0b","Type":"ContainerStarted","Data":"a0d2a911acdc2f29d24c80361358c4e52e636a18f70179050c4c7b5c5605dedb"}
Apr 17 08:05:24.353877 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:24.353846 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness"
status="not ready" pod="openshift-dns/dns-default-zbswg" Apr 17 08:05:24.366139 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:24.366080 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gj4s7" podStartSLOduration=129.179855465 podStartE2EDuration="2m11.366063603s" podCreationTimestamp="2026-04-17 08:03:13 +0000 UTC" firstStartedPulling="2026-04-17 08:05:22.015609416 +0000 UTC m=+164.713566059" lastFinishedPulling="2026-04-17 08:05:24.201817544 +0000 UTC m=+166.899774197" observedRunningTime="2026-04-17 08:05:24.365746057 +0000 UTC m=+167.063702747" watchObservedRunningTime="2026-04-17 08:05:24.366063603 +0000 UTC m=+167.064020268" Apr 17 08:05:24.382789 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:24.382736 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zbswg" podStartSLOduration=130.101981656 podStartE2EDuration="2m11.382720419s" podCreationTimestamp="2026-04-17 08:03:13 +0000 UTC" firstStartedPulling="2026-04-17 08:05:21.999657234 +0000 UTC m=+164.697613878" lastFinishedPulling="2026-04-17 08:05:23.280395992 +0000 UTC m=+165.978352641" observedRunningTime="2026-04-17 08:05:24.381985428 +0000 UTC m=+167.079942095" watchObservedRunningTime="2026-04-17 08:05:24.382720419 +0000 UTC m=+167.080677085" Apr 17 08:05:28.865899 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:28.865849 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5" Apr 17 08:05:29.075073 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:29.075033 2580 patch_prober.go:28] interesting pod/image-registry-8c884d54b-ths49 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 08:05:29.075236 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:29.075097 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-8c884d54b-ths49" podUID="cff45cdf-b293-4f29-9fc9-ac02d70f69dc" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:05:31.289954 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:31.289902 2580 patch_prober.go:28] interesting pod/image-registry-8c884d54b-ths49 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 08:05:31.290324 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:31.289980 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-8c884d54b-ths49" podUID="cff45cdf-b293-4f29-9fc9-ac02d70f69dc" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:05:33.461179 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:33.461142 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z" podUID="7df88737-fc1e-43b4-af0a-7d661a73b431" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 
08:05:34.358073 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:34.358041 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zbswg" Apr 17 08:05:39.075492 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:39.075454 2580 patch_prober.go:28] interesting pod/image-registry-8c884d54b-ths49 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 08:05:39.075984 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:39.075512 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-8c884d54b-ths49" podUID="cff45cdf-b293-4f29-9fc9-ac02d70f69dc" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:05:39.075984 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:39.075555 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-8c884d54b-ths49" Apr 17 08:05:39.076102 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:39.076022 2580 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"fb655d86a11c2f40299b1fd70cb9351c31136ce219ad60e2745be6040f95f7cd"} pod="openshift-image-registry/image-registry-8c884d54b-ths49" containerMessage="Container registry failed liveness probe, will be restarted" Apr 17 08:05:39.079385 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:39.079356 2580 patch_prober.go:28] interesting pod/image-registry-8c884d54b-ths49 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please 
see /debug/health"}]} Apr 17 08:05:39.079508 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:39.079406 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-8c884d54b-ths49" podUID="cff45cdf-b293-4f29-9fc9-ac02d70f69dc" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:05:41.637847 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:41.637820 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zbswg_e555087a-130b-4ab0-aaa8-92c983ed7e0b/dns/0.log" Apr 17 08:05:41.836847 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:41.836819 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zbswg_e555087a-130b-4ab0-aaa8-92c983ed7e0b/kube-rbac-proxy/0.log" Apr 17 08:05:42.437098 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:42.437071 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sv4zj_62832cab-fee8-498a-b9dd-d410e9f3e921/dns-node-resolver/0.log" Apr 17 08:05:42.836875 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:42.836800 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-8c884d54b-ths49_cff45cdf-b293-4f29-9fc9-ac02d70f69dc/registry/0.log" Apr 17 08:05:43.036200 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:43.036174 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-lf6p6_d06f7085-abd8-4770-bc98-8794c5f1a056/node-ca/0.log" Apr 17 08:05:43.461531 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:43.461493 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z" podUID="7df88737-fc1e-43b4-af0a-7d661a73b431" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 08:05:43.461683 ip-10-0-128-245 kubenswrapper[2580]: I0417 
08:05:43.461570 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z" Apr 17 08:05:43.462173 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:43.462144 2580 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"6a1413b741fb3dae05ef6e38ce9e76b9e6516677658684966ecb0e36a3a01b8a"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 08:05:43.462270 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:43.462200 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z" podUID="7df88737-fc1e-43b4-af0a-7d661a73b431" containerName="service-proxy" containerID="cri-o://6a1413b741fb3dae05ef6e38ce9e76b9e6516677658684966ecb0e36a3a01b8a" gracePeriod=30 Apr 17 08:05:43.836822 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:43.836711 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gj4s7_e1d27989-b735-4d7f-b801-7b81443d7ba7/serve-healthcheck-canary/0.log" Apr 17 08:05:44.406001 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:44.405965 2580 generic.go:358] "Generic (PLEG): container finished" podID="7df88737-fc1e-43b4-af0a-7d661a73b431" containerID="6a1413b741fb3dae05ef6e38ce9e76b9e6516677658684966ecb0e36a3a01b8a" exitCode=2 Apr 17 08:05:44.406376 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:44.406010 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z" event={"ID":"7df88737-fc1e-43b4-af0a-7d661a73b431","Type":"ContainerDied","Data":"6a1413b741fb3dae05ef6e38ce9e76b9e6516677658684966ecb0e36a3a01b8a"} Apr 17 08:05:44.406376 ip-10-0-128-245 
kubenswrapper[2580]: I0417 08:05:44.406033 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-657fbf4bf6-58b2z" event={"ID":"7df88737-fc1e-43b4-af0a-7d661a73b431","Type":"ContainerStarted","Data":"802085d59c20fe7140ddbda78c77cc57865f69d51c29c13c69a0f21483f733be"} Apr 17 08:05:49.079925 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:05:49.079881 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-8c884d54b-ths49" Apr 17 08:06:04.094295 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:06:04.094249 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-8c884d54b-ths49" podUID="cff45cdf-b293-4f29-9fc9-ac02d70f69dc" containerName="registry" containerID="cri-o://fb655d86a11c2f40299b1fd70cb9351c31136ce219ad60e2745be6040f95f7cd" gracePeriod=30 Apr 17 08:06:05.468052 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:06:05.467956 2580 generic.go:358] "Generic (PLEG): container finished" podID="cff45cdf-b293-4f29-9fc9-ac02d70f69dc" containerID="fb655d86a11c2f40299b1fd70cb9351c31136ce219ad60e2745be6040f95f7cd" exitCode=0 Apr 17 08:06:05.468398 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:06:05.468070 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8c884d54b-ths49" event={"ID":"cff45cdf-b293-4f29-9fc9-ac02d70f69dc","Type":"ContainerDied","Data":"fb655d86a11c2f40299b1fd70cb9351c31136ce219ad60e2745be6040f95f7cd"} Apr 17 08:06:05.468398 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:06:05.468099 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8c884d54b-ths49" event={"ID":"cff45cdf-b293-4f29-9fc9-ac02d70f69dc","Type":"ContainerStarted","Data":"1d8caf52e9ac3fa4f7d071641dd8edb53873cffa92b8895b90cb81b934335718"} Apr 17 08:06:05.468398 ip-10-0-128-245 kubenswrapper[2580]: I0417 
08:06:05.468207 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-8c884d54b-ths49" Apr 17 08:06:26.475172 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:06:26.475136 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-8c884d54b-ths49" Apr 17 08:06:49.712141 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:06:49.712047 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs\") pod \"network-metrics-daemon-r9td5\" (UID: \"5c94e060-29ca-49bd-9d62-210b4628adef\") " pod="openshift-multus/network-metrics-daemon-r9td5" Apr 17 08:06:49.714291 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:06:49.714272 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c94e060-29ca-49bd-9d62-210b4628adef-metrics-certs\") pod \"network-metrics-daemon-r9td5\" (UID: \"5c94e060-29ca-49bd-9d62-210b4628adef\") " pod="openshift-multus/network-metrics-daemon-r9td5" Apr 17 08:06:49.868330 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:06:49.868292 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x2n8l\"" Apr 17 08:06:49.876994 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:06:49.876963 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r9td5" Apr 17 08:06:49.995611 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:06:49.995526 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-r9td5"] Apr 17 08:06:49.999510 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:06:49.999479 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c94e060_29ca_49bd_9d62_210b4628adef.slice/crio-54f333ed80a7b6d7193e521becb1c852b47bcf200a58972cd59f5a407fd0eb09 WatchSource:0}: Error finding container 54f333ed80a7b6d7193e521becb1c852b47bcf200a58972cd59f5a407fd0eb09: Status 404 returned error can't find the container with id 54f333ed80a7b6d7193e521becb1c852b47bcf200a58972cd59f5a407fd0eb09 Apr 17 08:06:50.585486 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:06:50.585440 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-r9td5" event={"ID":"5c94e060-29ca-49bd-9d62-210b4628adef","Type":"ContainerStarted","Data":"54f333ed80a7b6d7193e521becb1c852b47bcf200a58972cd59f5a407fd0eb09"} Apr 17 08:06:51.590267 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:06:51.590226 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-r9td5" event={"ID":"5c94e060-29ca-49bd-9d62-210b4628adef","Type":"ContainerStarted","Data":"20dd3e0c21f7debc94d86f8e7fa3d103d959123f2eae6793b2e389ec1eca1438"} Apr 17 08:06:51.590267 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:06:51.590270 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-r9td5" event={"ID":"5c94e060-29ca-49bd-9d62-210b4628adef","Type":"ContainerStarted","Data":"92f335f3acc8366c46e43282b98881155fc108432968a81ef91c871e461ef0db"} Apr 17 08:06:51.606279 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:06:51.606223 2580 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-r9td5" podStartSLOduration=253.515759319 podStartE2EDuration="4m14.606204132s" podCreationTimestamp="2026-04-17 08:02:37 +0000 UTC" firstStartedPulling="2026-04-17 08:06:50.00137401 +0000 UTC m=+252.699330654" lastFinishedPulling="2026-04-17 08:06:51.091818819 +0000 UTC m=+253.789775467" observedRunningTime="2026-04-17 08:06:51.60464005 +0000 UTC m=+254.302596715" watchObservedRunningTime="2026-04-17 08:06:51.606204132 +0000 UTC m=+254.304160856" Apr 17 08:07:37.776083 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:07:37.776048 2580 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 08:08:30.162061 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:08:30.162022 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-bw9b5"] Apr 17 08:08:30.164951 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:08:30.164935 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-bw9b5" Apr 17 08:08:30.167013 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:08:30.166991 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 17 08:08:30.167676 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:08:30.167659 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-operator-dockercfg-gb6v4\"" Apr 17 08:08:30.167782 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:08:30.167679 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 17 08:08:30.172319 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:08:30.172293 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-bw9b5"] Apr 17 08:08:30.309710 ip-10-0-128-245 kubenswrapper[2580]: 
I0417 08:08:30.309675 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/051f4148-f469-4379-a909-ef3291283df0-tmp\") pod \"jobset-operator-747c5859c7-bw9b5\" (UID: \"051f4148-f469-4379-a909-ef3291283df0\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-bw9b5" Apr 17 08:08:30.309710 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:08:30.309721 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b642b\" (UniqueName: \"kubernetes.io/projected/051f4148-f469-4379-a909-ef3291283df0-kube-api-access-b642b\") pod \"jobset-operator-747c5859c7-bw9b5\" (UID: \"051f4148-f469-4379-a909-ef3291283df0\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-bw9b5" Apr 17 08:08:30.411007 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:08:30.410953 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/051f4148-f469-4379-a909-ef3291283df0-tmp\") pod \"jobset-operator-747c5859c7-bw9b5\" (UID: \"051f4148-f469-4379-a909-ef3291283df0\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-bw9b5" Apr 17 08:08:30.411137 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:08:30.411022 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b642b\" (UniqueName: \"kubernetes.io/projected/051f4148-f469-4379-a909-ef3291283df0-kube-api-access-b642b\") pod \"jobset-operator-747c5859c7-bw9b5\" (UID: \"051f4148-f469-4379-a909-ef3291283df0\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-bw9b5" Apr 17 08:08:30.411356 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:08:30.411338 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/051f4148-f469-4379-a909-ef3291283df0-tmp\") pod \"jobset-operator-747c5859c7-bw9b5\" (UID: 
\"051f4148-f469-4379-a909-ef3291283df0\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-bw9b5" Apr 17 08:08:30.419068 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:08:30.418989 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b642b\" (UniqueName: \"kubernetes.io/projected/051f4148-f469-4379-a909-ef3291283df0-kube-api-access-b642b\") pod \"jobset-operator-747c5859c7-bw9b5\" (UID: \"051f4148-f469-4379-a909-ef3291283df0\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-bw9b5" Apr 17 08:08:30.473938 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:08:30.473884 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-bw9b5" Apr 17 08:08:30.591022 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:08:30.590832 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-bw9b5"] Apr 17 08:08:30.593807 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:08:30.593775 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod051f4148_f469_4379_a909_ef3291283df0.slice/crio-1ca8036c3ddea823ae0996027469bad705138b2b6050d41469c9c9e5ddd3ddd0 WatchSource:0}: Error finding container 1ca8036c3ddea823ae0996027469bad705138b2b6050d41469c9c9e5ddd3ddd0: Status 404 returned error can't find the container with id 1ca8036c3ddea823ae0996027469bad705138b2b6050d41469c9c9e5ddd3ddd0 Apr 17 08:08:30.595793 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:08:30.595777 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:08:30.846206 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:08:30.846116 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-bw9b5" 
event={"ID":"051f4148-f469-4379-a909-ef3291283df0","Type":"ContainerStarted","Data":"1ca8036c3ddea823ae0996027469bad705138b2b6050d41469c9c9e5ddd3ddd0"} Apr 17 08:08:33.855492 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:08:33.855458 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-bw9b5" event={"ID":"051f4148-f469-4379-a909-ef3291283df0","Type":"ContainerStarted","Data":"71e580839de044406eb9630bb4308c4286a9f80b7d64cbc165b3f7ffb1dcf336"} Apr 17 08:08:33.870780 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:08:33.870728 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-operator-747c5859c7-bw9b5" podStartSLOduration=1.330499889 podStartE2EDuration="3.870713937s" podCreationTimestamp="2026-04-17 08:08:30 +0000 UTC" firstStartedPulling="2026-04-17 08:08:30.595899116 +0000 UTC m=+353.293855760" lastFinishedPulling="2026-04-17 08:08:33.136113161 +0000 UTC m=+355.834069808" observedRunningTime="2026-04-17 08:08:33.869463299 +0000 UTC m=+356.567419964" watchObservedRunningTime="2026-04-17 08:08:33.870713937 +0000 UTC m=+356.568670603" Apr 17 08:09:00.792209 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:00.792171 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-vg522"] Apr 17 08:09:00.795235 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:00.795217 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-vg522" Apr 17 08:09:00.797535 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:00.797507 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\"" Apr 17 08:09:00.797535 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:00.797526 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\"" Apr 17 08:09:00.798008 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:00.797993 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 08:09:00.798065 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:00.798044 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-c7lhc\"" Apr 17 08:09:00.798108 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:00.798064 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 08:09:00.804885 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:00.804860 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-vg522"] Apr 17 08:09:00.833462 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:00.833427 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c8bbbd4-50b5-405e-b8b4-7500b043aee3-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-vg522\" (UID: \"7c8bbbd4-50b5-405e-b8b4-7500b043aee3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-vg522" Apr 17 08:09:00.833645 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:00.833483 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/7c8bbbd4-50b5-405e-b8b4-7500b043aee3-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-vg522\" (UID: \"7c8bbbd4-50b5-405e-b8b4-7500b043aee3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-vg522"
Apr 17 08:09:00.833645 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:00.833543 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w55fk\" (UniqueName: \"kubernetes.io/projected/7c8bbbd4-50b5-405e-b8b4-7500b043aee3-kube-api-access-w55fk\") pod \"kubeflow-trainer-controller-manager-55f5694779-vg522\" (UID: \"7c8bbbd4-50b5-405e-b8b4-7500b043aee3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-vg522"
Apr 17 08:09:00.934580 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:00.934539 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c8bbbd4-50b5-405e-b8b4-7500b043aee3-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-vg522\" (UID: \"7c8bbbd4-50b5-405e-b8b4-7500b043aee3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-vg522"
Apr 17 08:09:00.934786 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:00.934616 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/7c8bbbd4-50b5-405e-b8b4-7500b043aee3-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-vg522\" (UID: \"7c8bbbd4-50b5-405e-b8b4-7500b043aee3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-vg522"
Apr 17 08:09:00.934786 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:00.934655 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w55fk\" (UniqueName: \"kubernetes.io/projected/7c8bbbd4-50b5-405e-b8b4-7500b043aee3-kube-api-access-w55fk\") pod \"kubeflow-trainer-controller-manager-55f5694779-vg522\" (UID: \"7c8bbbd4-50b5-405e-b8b4-7500b043aee3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-vg522"
Apr 17 08:09:00.935359 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:00.935339 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/7c8bbbd4-50b5-405e-b8b4-7500b043aee3-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-vg522\" (UID: \"7c8bbbd4-50b5-405e-b8b4-7500b043aee3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-vg522"
Apr 17 08:09:00.937000 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:00.936976 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c8bbbd4-50b5-405e-b8b4-7500b043aee3-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-vg522\" (UID: \"7c8bbbd4-50b5-405e-b8b4-7500b043aee3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-vg522"
Apr 17 08:09:00.943049 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:00.943022 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w55fk\" (UniqueName: \"kubernetes.io/projected/7c8bbbd4-50b5-405e-b8b4-7500b043aee3-kube-api-access-w55fk\") pod \"kubeflow-trainer-controller-manager-55f5694779-vg522\" (UID: \"7c8bbbd4-50b5-405e-b8b4-7500b043aee3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-vg522"
Apr 17 08:09:01.104334 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:01.104239 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-vg522"
Apr 17 08:09:01.223425 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:01.223391 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-vg522"]
Apr 17 08:09:01.227886 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:09:01.227853 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c8bbbd4_50b5_405e_b8b4_7500b043aee3.slice/crio-22bf347f69f901d7ce98651ad9066e320aafd34f5a4dd514992baa32478b590f WatchSource:0}: Error finding container 22bf347f69f901d7ce98651ad9066e320aafd34f5a4dd514992baa32478b590f: Status 404 returned error can't find the container with id 22bf347f69f901d7ce98651ad9066e320aafd34f5a4dd514992baa32478b590f
Apr 17 08:09:01.929070 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:01.929016 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-vg522" event={"ID":"7c8bbbd4-50b5-405e-b8b4-7500b043aee3","Type":"ContainerStarted","Data":"22bf347f69f901d7ce98651ad9066e320aafd34f5a4dd514992baa32478b590f"}
Apr 17 08:09:03.936232 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:03.936194 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-vg522" event={"ID":"7c8bbbd4-50b5-405e-b8b4-7500b043aee3","Type":"ContainerStarted","Data":"5dbd1736df3a1559ab62cac734c29caf9416f7879bc6854eff88f08f83739797"}
Apr 17 08:09:03.936649 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:03.936287 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-vg522"
Apr 17 08:09:03.952981 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:03.952892 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-vg522" podStartSLOduration=1.776787195 podStartE2EDuration="3.952870143s" podCreationTimestamp="2026-04-17 08:09:00 +0000 UTC" firstStartedPulling="2026-04-17 08:09:01.230145359 +0000 UTC m=+383.928102010" lastFinishedPulling="2026-04-17 08:09:03.406228309 +0000 UTC m=+386.104184958" observedRunningTime="2026-04-17 08:09:03.951685367 +0000 UTC m=+386.649642036" watchObservedRunningTime="2026-04-17 08:09:03.952870143 +0000 UTC m=+386.650826810"
Apr 17 08:09:19.944867 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:19.944833 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-vg522"
Apr 17 08:09:37.038488 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:37.038460 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-vg522_7c8bbbd4-50b5-405e-b8b4-7500b043aee3/manager/0.log"
Apr 17 08:09:37.482134 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:37.482099 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-vg522_7c8bbbd4-50b5-405e-b8b4-7500b043aee3/manager/0.log"
Apr 17 08:09:37.941165 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:09:37.941120 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-vg522_7c8bbbd4-50b5-405e-b8b4-7500b043aee3/manager/0.log"
Apr 17 08:10:13.876124 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:13.876092 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-btznw/must-gather-5cn2v"]
Apr 17 08:10:13.879191 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:13.879174 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-btznw/must-gather-5cn2v"
Apr 17 08:10:13.882481 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:13.882463 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-btznw\"/\"kube-root-ca.crt\""
Apr 17 08:10:13.882962 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:13.882947 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-btznw\"/\"default-dockercfg-z96wr\""
Apr 17 08:10:13.883021 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:13.882972 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-btznw\"/\"openshift-service-ca.crt\""
Apr 17 08:10:13.895365 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:13.895336 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-btznw/must-gather-5cn2v"]
Apr 17 08:10:13.926785 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:13.926747 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/58da10b3-c547-4b67-b4d4-dde474697137-must-gather-output\") pod \"must-gather-5cn2v\" (UID: \"58da10b3-c547-4b67-b4d4-dde474697137\") " pod="openshift-must-gather-btznw/must-gather-5cn2v"
Apr 17 08:10:13.926785 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:13.926785 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f78fz\" (UniqueName: \"kubernetes.io/projected/58da10b3-c547-4b67-b4d4-dde474697137-kube-api-access-f78fz\") pod \"must-gather-5cn2v\" (UID: \"58da10b3-c547-4b67-b4d4-dde474697137\") " pod="openshift-must-gather-btznw/must-gather-5cn2v"
Apr 17 08:10:14.028110 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:14.028073 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/58da10b3-c547-4b67-b4d4-dde474697137-must-gather-output\") pod \"must-gather-5cn2v\" (UID: \"58da10b3-c547-4b67-b4d4-dde474697137\") " pod="openshift-must-gather-btznw/must-gather-5cn2v"
Apr 17 08:10:14.028110 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:14.028112 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f78fz\" (UniqueName: \"kubernetes.io/projected/58da10b3-c547-4b67-b4d4-dde474697137-kube-api-access-f78fz\") pod \"must-gather-5cn2v\" (UID: \"58da10b3-c547-4b67-b4d4-dde474697137\") " pod="openshift-must-gather-btznw/must-gather-5cn2v"
Apr 17 08:10:14.028425 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:14.028403 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/58da10b3-c547-4b67-b4d4-dde474697137-must-gather-output\") pod \"must-gather-5cn2v\" (UID: \"58da10b3-c547-4b67-b4d4-dde474697137\") " pod="openshift-must-gather-btznw/must-gather-5cn2v"
Apr 17 08:10:14.035475 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:14.035441 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f78fz\" (UniqueName: \"kubernetes.io/projected/58da10b3-c547-4b67-b4d4-dde474697137-kube-api-access-f78fz\") pod \"must-gather-5cn2v\" (UID: \"58da10b3-c547-4b67-b4d4-dde474697137\") " pod="openshift-must-gather-btznw/must-gather-5cn2v"
Apr 17 08:10:14.187766 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:14.187718 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-btznw/must-gather-5cn2v"
Apr 17 08:10:14.304276 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:14.304239 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-btznw/must-gather-5cn2v"]
Apr 17 08:10:14.307329 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:10:14.307300 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58da10b3_c547_4b67_b4d4_dde474697137.slice/crio-f89459f6ae7a064c8815ff579f55c20541c42f455a1381126d673c82d0246bc5 WatchSource:0}: Error finding container f89459f6ae7a064c8815ff579f55c20541c42f455a1381126d673c82d0246bc5: Status 404 returned error can't find the container with id f89459f6ae7a064c8815ff579f55c20541c42f455a1381126d673c82d0246bc5
Apr 17 08:10:15.118133 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:15.118091 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-btznw/must-gather-5cn2v" event={"ID":"58da10b3-c547-4b67-b4d4-dde474697137","Type":"ContainerStarted","Data":"f89459f6ae7a064c8815ff579f55c20541c42f455a1381126d673c82d0246bc5"}
Apr 17 08:10:16.123350 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:16.123309 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-btznw/must-gather-5cn2v" event={"ID":"58da10b3-c547-4b67-b4d4-dde474697137","Type":"ContainerStarted","Data":"81e239f749e5d0bce1e705425948cd528f9e38555f7aba3be1e9c27fcb129036"}
Apr 17 08:10:16.123765 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:16.123356 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-btznw/must-gather-5cn2v" event={"ID":"58da10b3-c547-4b67-b4d4-dde474697137","Type":"ContainerStarted","Data":"c4bc08218c090f56c2d5dbd72b158310f83f641fc223804ff3c711d930b25c49"}
Apr 17 08:10:16.139481 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:16.139110 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-btznw/must-gather-5cn2v" podStartSLOduration=2.255803396 podStartE2EDuration="3.139089086s" podCreationTimestamp="2026-04-17 08:10:13 +0000 UTC" firstStartedPulling="2026-04-17 08:10:14.309065938 +0000 UTC m=+457.007022582" lastFinishedPulling="2026-04-17 08:10:15.192351615 +0000 UTC m=+457.890308272" observedRunningTime="2026-04-17 08:10:16.137760081 +0000 UTC m=+458.835716748" watchObservedRunningTime="2026-04-17 08:10:16.139089086 +0000 UTC m=+458.837045761"
Apr 17 08:10:16.570087 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:16.570048 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-6zncl_28077202-06dd-4ed2-862a-f70c6f35f820/global-pull-secret-syncer/0.log"
Apr 17 08:10:16.699999 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:16.699958 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-ttxzs_36159bce-5e37-46a6-b216-a4bc0a7e38a8/konnectivity-agent/0.log"
Apr 17 08:10:16.718714 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:16.718677 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-245.ec2.internal_d0c56bbacc967c418e4c91e68e0ba0d3/haproxy/0.log"
Apr 17 08:10:20.062251 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:20.062190 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-b6brf_ae5e4541-4bb5-425b-a511-1291960981fc/node-exporter/0.log"
Apr 17 08:10:20.086562 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:20.086463 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-b6brf_ae5e4541-4bb5-425b-a511-1291960981fc/kube-rbac-proxy/0.log"
Apr 17 08:10:20.110430 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:20.110395 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-b6brf_ae5e4541-4bb5-425b-a511-1291960981fc/init-textfile/0.log"
Apr 17 08:10:23.456801 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.456744 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"]
Apr 17 08:10:23.461341 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.461310 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"
Apr 17 08:10:23.468367 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.468329 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"]
Apr 17 08:10:23.620574 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.620540 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/32ec9d03-16eb-4085-b576-a90cf521e868-sys\") pod \"perf-node-gather-daemonset-7hznx\" (UID: \"32ec9d03-16eb-4085-b576-a90cf521e868\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"
Apr 17 08:10:23.620574 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.620578 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/32ec9d03-16eb-4085-b576-a90cf521e868-podres\") pod \"perf-node-gather-daemonset-7hznx\" (UID: \"32ec9d03-16eb-4085-b576-a90cf521e868\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"
Apr 17 08:10:23.620792 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.620599 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2rkk\" (UniqueName: \"kubernetes.io/projected/32ec9d03-16eb-4085-b576-a90cf521e868-kube-api-access-f2rkk\") pod \"perf-node-gather-daemonset-7hznx\" (UID: \"32ec9d03-16eb-4085-b576-a90cf521e868\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"
Apr 17 08:10:23.620792 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.620677 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/32ec9d03-16eb-4085-b576-a90cf521e868-lib-modules\") pod \"perf-node-gather-daemonset-7hznx\" (UID: \"32ec9d03-16eb-4085-b576-a90cf521e868\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"
Apr 17 08:10:23.620792 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.620716 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/32ec9d03-16eb-4085-b576-a90cf521e868-proc\") pod \"perf-node-gather-daemonset-7hznx\" (UID: \"32ec9d03-16eb-4085-b576-a90cf521e868\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"
Apr 17 08:10:23.722145 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.722056 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/32ec9d03-16eb-4085-b576-a90cf521e868-proc\") pod \"perf-node-gather-daemonset-7hznx\" (UID: \"32ec9d03-16eb-4085-b576-a90cf521e868\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"
Apr 17 08:10:23.722294 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.722169 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/32ec9d03-16eb-4085-b576-a90cf521e868-sys\") pod \"perf-node-gather-daemonset-7hznx\" (UID: \"32ec9d03-16eb-4085-b576-a90cf521e868\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"
Apr 17 08:10:23.722294 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.722178 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/32ec9d03-16eb-4085-b576-a90cf521e868-proc\") pod \"perf-node-gather-daemonset-7hznx\" (UID: \"32ec9d03-16eb-4085-b576-a90cf521e868\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"
Apr 17 08:10:23.722294 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.722197 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/32ec9d03-16eb-4085-b576-a90cf521e868-podres\") pod \"perf-node-gather-daemonset-7hznx\" (UID: \"32ec9d03-16eb-4085-b576-a90cf521e868\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"
Apr 17 08:10:23.722294 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.722235 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2rkk\" (UniqueName: \"kubernetes.io/projected/32ec9d03-16eb-4085-b576-a90cf521e868-kube-api-access-f2rkk\") pod \"perf-node-gather-daemonset-7hznx\" (UID: \"32ec9d03-16eb-4085-b576-a90cf521e868\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"
Apr 17 08:10:23.722294 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.722264 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/32ec9d03-16eb-4085-b576-a90cf521e868-lib-modules\") pod \"perf-node-gather-daemonset-7hznx\" (UID: \"32ec9d03-16eb-4085-b576-a90cf521e868\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"
Apr 17 08:10:23.722294 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.722277 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/32ec9d03-16eb-4085-b576-a90cf521e868-sys\") pod \"perf-node-gather-daemonset-7hznx\" (UID: \"32ec9d03-16eb-4085-b576-a90cf521e868\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"
Apr 17 08:10:23.722494 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.722316 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/32ec9d03-16eb-4085-b576-a90cf521e868-podres\") pod \"perf-node-gather-daemonset-7hznx\" (UID: \"32ec9d03-16eb-4085-b576-a90cf521e868\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"
Apr 17 08:10:23.722494 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.722393 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/32ec9d03-16eb-4085-b576-a90cf521e868-lib-modules\") pod \"perf-node-gather-daemonset-7hznx\" (UID: \"32ec9d03-16eb-4085-b576-a90cf521e868\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"
Apr 17 08:10:23.730319 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.730252 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2rkk\" (UniqueName: \"kubernetes.io/projected/32ec9d03-16eb-4085-b576-a90cf521e868-kube-api-access-f2rkk\") pod \"perf-node-gather-daemonset-7hznx\" (UID: \"32ec9d03-16eb-4085-b576-a90cf521e868\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"
Apr 17 08:10:23.775077 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.775040 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"
Apr 17 08:10:23.857041 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.856980 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zbswg_e555087a-130b-4ab0-aaa8-92c983ed7e0b/dns/0.log"
Apr 17 08:10:23.880181 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.880155 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zbswg_e555087a-130b-4ab0-aaa8-92c983ed7e0b/kube-rbac-proxy/0.log"
Apr 17 08:10:23.897147 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.897102 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"]
Apr 17 08:10:23.900016 ip-10-0-128-245 kubenswrapper[2580]: W0417 08:10:23.899986 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod32ec9d03_16eb_4085_b576_a90cf521e868.slice/crio-f582a68854cd5f410168ffc4fd5390d2270f32cebe2f1295486d1f1df641591d WatchSource:0}: Error finding container f582a68854cd5f410168ffc4fd5390d2270f32cebe2f1295486d1f1df641591d: Status 404 returned error can't find the container with id f582a68854cd5f410168ffc4fd5390d2270f32cebe2f1295486d1f1df641591d
Apr 17 08:10:23.963023 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:23.962995 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sv4zj_62832cab-fee8-498a-b9dd-d410e9f3e921/dns-node-resolver/0.log"
Apr 17 08:10:24.151326 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:24.151292 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx" event={"ID":"32ec9d03-16eb-4085-b576-a90cf521e868","Type":"ContainerStarted","Data":"5570f5a233650bd63cf84817d23ebb78ac8de3bc03e0bbc665b65eb4c0013191"}
Apr 17 08:10:24.151326 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:24.151329 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx" event={"ID":"32ec9d03-16eb-4085-b576-a90cf521e868","Type":"ContainerStarted","Data":"f582a68854cd5f410168ffc4fd5390d2270f32cebe2f1295486d1f1df641591d"}
Apr 17 08:10:24.151545 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:24.151407 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"
Apr 17 08:10:24.165641 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:24.165586 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx" podStartSLOduration=1.165571258 podStartE2EDuration="1.165571258s" podCreationTimestamp="2026-04-17 08:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:10:24.164076692 +0000 UTC m=+466.862033357" watchObservedRunningTime="2026-04-17 08:10:24.165571258 +0000 UTC m=+466.863527928"
Apr 17 08:10:24.328273 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:24.328180 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-8c884d54b-ths49_cff45cdf-b293-4f29-9fc9-ac02d70f69dc/registry/0.log"
Apr 17 08:10:24.329776 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:24.329749 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-8c884d54b-ths49_cff45cdf-b293-4f29-9fc9-ac02d70f69dc/registry/1.log"
Apr 17 08:10:24.348330 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:24.348296 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-lf6p6_d06f7085-abd8-4770-bc98-8794c5f1a056/node-ca/0.log"
Apr 17 08:10:25.351823 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:25.351785 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gj4s7_e1d27989-b735-4d7f-b801-7b81443d7ba7/serve-healthcheck-canary/0.log"
Apr 17 08:10:25.805784 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:25.805751 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-td59z_794bfcee-a6d5-46eb-81ca-623c6c1871af/kube-rbac-proxy/0.log"
Apr 17 08:10:25.825676 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:25.825646 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-td59z_794bfcee-a6d5-46eb-81ca-623c6c1871af/exporter/0.log"
Apr 17 08:10:25.846598 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:25.846558 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-td59z_794bfcee-a6d5-46eb-81ca-623c6c1871af/extractor/0.log"
Apr 17 08:10:27.458803 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:27.458774 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-operator-747c5859c7-bw9b5_051f4148-f469-4379-a909-ef3291283df0/jobset-operator/0.log"
Apr 17 08:10:30.166823 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:30.166787 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-7hznx"
Apr 17 08:10:31.579766 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:31.579738 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7p6p_18d4abe2-95b8-4158-acde-3d01b4526f60/kube-multus-additional-cni-plugins/0.log"
Apr 17 08:10:31.603042 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:31.603000 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7p6p_18d4abe2-95b8-4158-acde-3d01b4526f60/egress-router-binary-copy/0.log"
Apr 17 08:10:31.624996 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:31.624969 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7p6p_18d4abe2-95b8-4158-acde-3d01b4526f60/cni-plugins/0.log"
Apr 17 08:10:31.648098 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:31.648063 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7p6p_18d4abe2-95b8-4158-acde-3d01b4526f60/bond-cni-plugin/0.log"
Apr 17 08:10:31.669862 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:31.669831 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7p6p_18d4abe2-95b8-4158-acde-3d01b4526f60/routeoverride-cni/0.log"
Apr 17 08:10:31.692961 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:31.692927 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7p6p_18d4abe2-95b8-4158-acde-3d01b4526f60/whereabouts-cni-bincopy/0.log"
Apr 17 08:10:31.715902 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:31.715873 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7p6p_18d4abe2-95b8-4158-acde-3d01b4526f60/whereabouts-cni/0.log"
Apr 17 08:10:32.136874 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:32.136840 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r995s_abf436c1-3b8e-4c83-b4e5-4cae8c04c259/kube-multus/0.log"
Apr 17 08:10:32.211589 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:32.211544 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-r9td5_5c94e060-29ca-49bd-9d62-210b4628adef/network-metrics-daemon/0.log"
Apr 17 08:10:32.233289 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:32.233256 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-r9td5_5c94e060-29ca-49bd-9d62-210b4628adef/kube-rbac-proxy/0.log"
Apr 17 08:10:33.593624 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:33.593593 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wwcgr_02718710-e78f-45e5-97ee-f802acc6c063/ovn-controller/0.log"
Apr 17 08:10:33.616964 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:33.616930 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wwcgr_02718710-e78f-45e5-97ee-f802acc6c063/ovn-acl-logging/0.log"
Apr 17 08:10:33.639109 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:33.639074 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wwcgr_02718710-e78f-45e5-97ee-f802acc6c063/kube-rbac-proxy-node/0.log"
Apr 17 08:10:33.660743 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:33.660714 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wwcgr_02718710-e78f-45e5-97ee-f802acc6c063/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 08:10:33.677525 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:33.677494 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wwcgr_02718710-e78f-45e5-97ee-f802acc6c063/northd/0.log"
Apr 17 08:10:33.697576 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:33.697547 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wwcgr_02718710-e78f-45e5-97ee-f802acc6c063/nbdb/0.log"
Apr 17 08:10:33.717620 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:33.717587 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wwcgr_02718710-e78f-45e5-97ee-f802acc6c063/sbdb/0.log"
Apr 17 08:10:33.911817 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:33.911789 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wwcgr_02718710-e78f-45e5-97ee-f802acc6c063/ovnkube-controller/0.log"
Apr 17 08:10:34.922198 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:34.922168 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-bhnbj_2e9bd659-98fd-4a44-bf4a-2d4b8eb7ffb2/network-check-target-container/0.log"
Apr 17 08:10:35.755557 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:35.755525 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-pr74b_e95c47f8-d745-42a4-8a4b-3f83ef6805b8/iptables-alerter/0.log"
Apr 17 08:10:36.370267 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:36.370236 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-clh84_73847dfb-9da7-48a8-9c86-58744827d1a8/tuned/0.log"
Apr 17 08:10:39.523109 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:39.523081 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-5vvsg_85251bca-2387-47f1-892a-cf015be5673d/csi-driver/0.log"
Apr 17 08:10:39.548204 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:39.548165 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-5vvsg_85251bca-2387-47f1-892a-cf015be5673d/csi-node-driver-registrar/0.log"
Apr 17 08:10:39.570332 ip-10-0-128-245 kubenswrapper[2580]: I0417 08:10:39.570302 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-5vvsg_85251bca-2387-47f1-892a-cf015be5673d/csi-liveness-probe/0.log"