Apr 16 10:04:35.096796 ip-10-0-143-196 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 10:04:35.563474 ip-10-0-143-196 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 10:04:35.563474 ip-10-0-143-196 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 10:04:35.563474 ip-10-0-143-196 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 10:04:35.563474 ip-10-0-143-196 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 10:04:35.563474 ip-10-0-143-196 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 10:04:35.566992 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.566877 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 10:04:35.570147 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570132 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 10:04:35.570147 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570147 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 10:04:35.570208 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570151 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 10:04:35.570208 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570155 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 10:04:35.570208 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570159 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 10:04:35.570208 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570162 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 10:04:35.570208 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570166 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 10:04:35.570208 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570169 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 10:04:35.570208 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570171 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 10:04:35.570208 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570174 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 10:04:35.570208 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570177 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 10:04:35.570208 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570180 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 10:04:35.570208 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570183 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 10:04:35.570208 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570185 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 10:04:35.570208 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570188 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 10:04:35.570208 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570190 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 10:04:35.570208 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570193 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 10:04:35.570208 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570195 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 10:04:35.570208 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570198 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 10:04:35.570208 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570200 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 10:04:35.570208 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570203 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 10:04:35.570208 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570205 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 10:04:35.570722 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570208 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 10:04:35.570722 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570211 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 10:04:35.570722 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570214 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 10:04:35.570722 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570217 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 10:04:35.570722 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570225 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 10:04:35.570722 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570228 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 10:04:35.570722 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570230 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 10:04:35.570722 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570233 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 10:04:35.570722 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570235 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 10:04:35.570722 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570238 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 10:04:35.570722 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570241 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 10:04:35.570722 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570244 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 10:04:35.570722 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570246 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 10:04:35.570722 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570249 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 10:04:35.570722 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570251 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 10:04:35.570722 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570254 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 10:04:35.570722 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570256 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 10:04:35.570722 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570259 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 10:04:35.570722 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570262 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 10:04:35.570722 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570264 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 10:04:35.571241 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570267 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 10:04:35.571241 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570269 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 10:04:35.571241 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570273 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 10:04:35.571241 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570276 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 10:04:35.571241 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570278 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 10:04:35.571241 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570281 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 10:04:35.571241 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570283 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 10:04:35.571241 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570286 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 10:04:35.571241 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570289 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 10:04:35.571241 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570291 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 10:04:35.571241 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570294 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 10:04:35.571241 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570296 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 10:04:35.571241 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570299 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 10:04:35.571241 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570302 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 10:04:35.571241 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570306 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 10:04:35.571241 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570308 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 10:04:35.571241 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570311 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 10:04:35.571241 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570314 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 10:04:35.571241 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570316 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 10:04:35.571241 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570319 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 10:04:35.571808 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570321 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 10:04:35.571808 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570324 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 10:04:35.571808 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570326 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 10:04:35.571808 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570329 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 10:04:35.571808 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570331 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 10:04:35.571808 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570334 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 10:04:35.571808 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570336 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 10:04:35.571808 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570339 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 10:04:35.571808 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570343 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 10:04:35.571808 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570347 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 10:04:35.571808 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570350 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 10:04:35.571808 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570354 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 10:04:35.571808 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570357 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 10:04:35.571808 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570360 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 10:04:35.571808 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570363 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 10:04:35.571808 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570366 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 10:04:35.571808 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570369 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 10:04:35.571808 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570373 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 10:04:35.571808 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570376 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570378 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570381 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570384 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570386 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570392 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570828 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570835 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570838 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570841 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570844 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570847 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570849 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570852 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570854 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570857 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570860 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570862 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570865 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570867 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570870 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 10:04:35.572287 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570873 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 10:04:35.572845 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570876 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 10:04:35.572845 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570878 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 10:04:35.572845 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570881 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 10:04:35.572845 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570883 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 10:04:35.572845 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570886 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 10:04:35.572845 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570889 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 10:04:35.572845 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570891 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 10:04:35.572845 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570894 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 10:04:35.572845 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570896 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 10:04:35.572845 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570899 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 10:04:35.572845 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570901 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 10:04:35.572845 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570904 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 10:04:35.572845 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570906 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 10:04:35.572845 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570909 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 10:04:35.572845 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570911 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 10:04:35.572845 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570914 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 10:04:35.572845 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570917 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 10:04:35.572845 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570920 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 10:04:35.572845 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570923 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 10:04:35.572845 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570926 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 10:04:35.573374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570929 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 10:04:35.573374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570932 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 10:04:35.573374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570936 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 10:04:35.573374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570940 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 10:04:35.573374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570943 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 10:04:35.573374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570946 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 10:04:35.573374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570948 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 10:04:35.573374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570951 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 10:04:35.573374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570953 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 10:04:35.573374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570956 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 10:04:35.573374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570958 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 10:04:35.573374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570961 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 10:04:35.573374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570963 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 10:04:35.573374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570966 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 10:04:35.573374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570969 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 10:04:35.573374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570972 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 10:04:35.573374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570974 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 10:04:35.573374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570977 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 10:04:35.573374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570979 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 10:04:35.573881 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570982 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 10:04:35.573881 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570984 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 10:04:35.573881 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570987 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 10:04:35.573881 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570989 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 10:04:35.573881 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570992 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 10:04:35.573881 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570994 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 10:04:35.573881 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570996 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 10:04:35.573881 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.570999 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 10:04:35.573881 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571002 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 10:04:35.573881 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571004 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 10:04:35.573881 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571007 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 10:04:35.573881 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571010 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 10:04:35.573881 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571012 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 10:04:35.573881 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571015 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 10:04:35.573881 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571018 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 10:04:35.573881 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571021 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 10:04:35.573881 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571024 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 10:04:35.573881 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571026 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 10:04:35.573881 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571029 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 10:04:35.573881 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571031 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 10:04:35.574374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571034 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 10:04:35.574374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571036 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 10:04:35.574374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571039 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 10:04:35.574374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571043 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 10:04:35.574374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571046 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 10:04:35.574374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571049 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 10:04:35.574374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571052 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 10:04:35.574374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571056 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 10:04:35.574374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571059 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 10:04:35.574374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571061 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 10:04:35.574374 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.571064 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 10:04:35.574374 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.571936 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 10:04:35.574374 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.571951 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 10:04:35.574374 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.571959 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 10:04:35.574374 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.571963 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 10:04:35.574374 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.571968 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 10:04:35.574374 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.571971 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 10:04:35.574374 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.571976 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 10:04:35.574374 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.571980 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 10:04:35.574374 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.571984 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.571988 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.571992 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.571995 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.571999 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572002 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572005 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572008 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572011 2570 flags.go:64] FLAG: --cloud-config=""
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572022 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572026 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572030 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572033 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572036 2570 flags.go:64] FLAG: --config-dir=""
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572039 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572043 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572048 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572051 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572054 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572057 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572060 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572064 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572067 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572071 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572073 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 10:04:35.574877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572078 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572081 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572084 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572089 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572092 2570 flags.go:64] FLAG: --enable-server="true"
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572095 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572100 2570 flags.go:64] FLAG: --event-burst="100"
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572103 2570 flags.go:64] FLAG: --event-qps="50"
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572107 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572110 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572113 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572117 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572120 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572123 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572126 2570 flags.go:64] FLAG: --eviction-soft=""
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572129 2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572132 2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572135 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572138 2570 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572141 2570 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572144 2570 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572147 2570 flags.go:64] FLAG: --feature-gates=""
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572150 2570 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572153 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572156 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 10:04:35.575473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572159 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572162 2570 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 10:04:35.576115 ip-10-0-143-196
kubenswrapper[2570]: I0416 10:04:35.572165 2570 flags.go:64] FLAG: --help="false" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572169 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-143-196.ec2.internal" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572173 2570 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572176 2570 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572179 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572183 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572187 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572190 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572194 2570 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572197 2570 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572200 2570 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572203 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572206 2570 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572209 2570 flags.go:64] FLAG: 
--kube-reserved="" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572212 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572215 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572218 2570 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572221 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572223 2570 flags.go:64] FLAG: --lock-file="" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572226 2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572229 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572232 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 10:04:35.576115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572237 2570 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572240 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572243 2570 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572246 2570 flags.go:64] FLAG: --logging-format="text" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572249 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572252 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 10:04:35.576704 ip-10-0-143-196 
kubenswrapper[2570]: I0416 10:04:35.572255 2570 flags.go:64] FLAG: --manifest-url="" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572258 2570 flags.go:64] FLAG: --manifest-url-header="" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572262 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572265 2570 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572269 2570 flags.go:64] FLAG: --max-pods="110" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572272 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572275 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572280 2570 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572283 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572286 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572289 2570 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572293 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572301 2570 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572305 2570 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572308 2570 
flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572311 2570 flags.go:64] FLAG: --pod-cidr="" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572314 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 10:04:35.576704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572319 2570 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572322 2570 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572325 2570 flags.go:64] FLAG: --pods-per-core="0" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572328 2570 flags.go:64] FLAG: --port="10250" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572331 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572334 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00d9da74ee385bc5d" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572337 2570 flags.go:64] FLAG: --qos-reserved="" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572341 2570 flags.go:64] FLAG: --read-only-port="10255" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572344 2570 flags.go:64] FLAG: --register-node="true" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572346 2570 flags.go:64] FLAG: --register-schedulable="true" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572349 2570 flags.go:64] FLAG: --register-with-taints="" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572353 2570 flags.go:64] FLAG: --registry-burst="10" 
Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572356 2570 flags.go:64] FLAG: --registry-qps="5" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572359 2570 flags.go:64] FLAG: --reserved-cpus="" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572362 2570 flags.go:64] FLAG: --reserved-memory="" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572365 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572368 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572371 2570 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572374 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572377 2570 flags.go:64] FLAG: --runonce="false" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572380 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572383 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572386 2570 flags.go:64] FLAG: --seccomp-default="false" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572389 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572395 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572398 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 10:04:35.577255 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572402 2570 flags.go:64] 
FLAG: --storage-driver-host="localhost:8086" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572406 2570 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572409 2570 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572412 2570 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572415 2570 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572417 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572420 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572423 2570 flags.go:64] FLAG: --system-cgroups="" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572426 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572432 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572435 2570 flags.go:64] FLAG: --tls-cert-file="" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572438 2570 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572442 2570 flags.go:64] FLAG: --tls-min-version="" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572445 2570 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572448 2570 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 10:04:35.577907 ip-10-0-143-196 
kubenswrapper[2570]: I0416 10:04:35.572450 2570 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572453 2570 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572456 2570 flags.go:64] FLAG: --v="2" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572460 2570 flags.go:64] FLAG: --version="false" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572464 2570 flags.go:64] FLAG: --vmodule="" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572469 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572472 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572596 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572601 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 16 10:04:35.577907 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572605 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 10:04:35.578492 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572608 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 10:04:35.578492 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572611 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 10:04:35.578492 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572614 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 10:04:35.578492 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572617 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 10:04:35.578492 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572620 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 10:04:35.578492 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572626 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 10:04:35.578492 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572629 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 10:04:35.578492 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572632 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 10:04:35.578492 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572635 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 10:04:35.578492 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572637 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 10:04:35.578492 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572640 2570 feature_gate.go:328] 
unrecognized feature gate: MetricsCollectionProfiles Apr 16 10:04:35.578492 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572642 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 10:04:35.578492 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572645 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 10:04:35.578492 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572648 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 10:04:35.578492 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572650 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 10:04:35.578492 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572653 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 10:04:35.578492 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572655 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 10:04:35.578492 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572658 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 10:04:35.578492 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572661 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 10:04:35.579020 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572664 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 10:04:35.579020 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572666 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 10:04:35.579020 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572669 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 10:04:35.579020 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572671 2570 feature_gate.go:328] unrecognized feature gate: 
VSphereMultiNetworks Apr 16 10:04:35.579020 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572674 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 10:04:35.579020 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572676 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 10:04:35.579020 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572679 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 10:04:35.579020 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572682 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 10:04:35.579020 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572684 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 10:04:35.579020 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572687 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 10:04:35.579020 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572690 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 10:04:35.579020 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572692 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 10:04:35.579020 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572695 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 10:04:35.579020 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572697 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 10:04:35.579020 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572700 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 10:04:35.579020 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572702 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 10:04:35.579020 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572705 2570 
feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 10:04:35.579020 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572708 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 10:04:35.579020 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572712 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 10:04:35.579476 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572717 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 10:04:35.579476 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572720 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 10:04:35.579476 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572723 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 10:04:35.579476 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572726 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 10:04:35.579476 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572728 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 10:04:35.579476 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572731 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 10:04:35.579476 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572733 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 10:04:35.579476 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572736 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 10:04:35.579476 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572738 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 10:04:35.579476 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572741 2570 feature_gate.go:328] unrecognized feature gate: 
InsightsConfigAPI Apr 16 10:04:35.579476 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572744 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 10:04:35.579476 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572746 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 10:04:35.579476 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572749 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 10:04:35.579476 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572751 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 10:04:35.579476 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572754 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 10:04:35.579476 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572756 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 10:04:35.579476 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572759 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 10:04:35.579476 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572761 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 10:04:35.579476 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572764 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 10:04:35.579476 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572767 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 10:04:35.579980 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572769 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 10:04:35.579980 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572772 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 10:04:35.579980 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572774 2570 feature_gate.go:328] 
unrecognized feature gate: ClusterAPIInstall Apr 16 10:04:35.579980 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572777 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 10:04:35.579980 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572779 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 10:04:35.579980 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572782 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 10:04:35.579980 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572785 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 10:04:35.579980 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572787 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 10:04:35.579980 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572790 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 10:04:35.579980 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572793 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 10:04:35.579980 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572795 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 10:04:35.579980 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572799 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 10:04:35.579980 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572802 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 10:04:35.579980 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572805 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 10:04:35.579980 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572808 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 10:04:35.579980 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572811 2570 
feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 10:04:35.579980 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572813 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 10:04:35.579980 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572816 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 10:04:35.579980 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572818 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 10:04:35.579980 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572821 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 10:04:35.580487 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572823 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 10:04:35.580487 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572826 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 10:04:35.580487 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572829 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 10:04:35.580487 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572831 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 10:04:35.580487 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.572834 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 10:04:35.580487 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.572839 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 10:04:35.580487 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.579809 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 10:04:35.580487 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.579827 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 10:04:35.580487 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579875 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 10:04:35.580487 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579880 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 10:04:35.580487 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579884 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 10:04:35.580487 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579887 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 10:04:35.580487 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579891 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 10:04:35.580487 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579894 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 10:04:35.580487 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579897 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 10:04:35.580487 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579900 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 10:04:35.580903 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579902 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 10:04:35.580903 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579905 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 10:04:35.580903 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579907 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 10:04:35.580903 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579910 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 10:04:35.580903 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579912 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 10:04:35.580903 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579915 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 10:04:35.580903 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579918 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 10:04:35.580903 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579921 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 10:04:35.580903 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579924 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 10:04:35.580903 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579926 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 10:04:35.580903 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579929 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 10:04:35.580903 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579932 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 10:04:35.580903 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579935 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 10:04:35.580903 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579937 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 10:04:35.580903 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579940 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 10:04:35.580903 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579943 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 10:04:35.580903 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579946 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 10:04:35.580903 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579948 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 10:04:35.580903 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579951 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 10:04:35.580903 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579954 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 10:04:35.581381 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579957 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 10:04:35.581381 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579959 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 10:04:35.581381 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579962 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 10:04:35.581381 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579965 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 10:04:35.581381 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579968 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 10:04:35.581381 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579971 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 10:04:35.581381 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579973 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 10:04:35.581381 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579976 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 10:04:35.581381 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579979 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 10:04:35.581381 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579982 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 10:04:35.581381 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579984 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 10:04:35.581381 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579987 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 10:04:35.581381 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579990 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 10:04:35.581381 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579993 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 10:04:35.581381 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579995 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 10:04:35.581381 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.579998 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 10:04:35.581381 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580001 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 10:04:35.581381 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580004 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 10:04:35.581381 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580006 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 10:04:35.581381 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580009 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 10:04:35.581885 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580018 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 10:04:35.581885 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580022 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 10:04:35.581885 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580025 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 10:04:35.581885 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580028 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 10:04:35.581885 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580031 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 10:04:35.581885 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580034 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 10:04:35.581885 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580036 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 10:04:35.581885 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580040 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 10:04:35.581885 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580045 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 10:04:35.581885 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580050 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 10:04:35.581885 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580053 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 10:04:35.581885 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580056 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 10:04:35.581885 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580060 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 10:04:35.581885 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580064 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 10:04:35.581885 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580067 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 10:04:35.581885 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580070 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 10:04:35.581885 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580073 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 10:04:35.581885 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580076 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 10:04:35.581885 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580079 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 10:04:35.582405 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580081 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 10:04:35.582405 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580084 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 10:04:35.582405 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580087 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 10:04:35.582405 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580089 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 10:04:35.582405 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580092 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 10:04:35.582405 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580095 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 10:04:35.582405 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580097 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 10:04:35.582405 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580100 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 10:04:35.582405 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580103 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 10:04:35.582405 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580106 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 10:04:35.582405 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580108 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 10:04:35.582405 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580111 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 10:04:35.582405 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580113 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 10:04:35.582405 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580116 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 10:04:35.582405 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580119 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 10:04:35.582405 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580121 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 10:04:35.582405 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580124 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 10:04:35.582405 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580126 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 10:04:35.582405 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580129 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 10:04:35.582891 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.580134 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 10:04:35.582891 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580231 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 10:04:35.582891 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580235 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 10:04:35.582891 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580238 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 10:04:35.582891 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580241 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 10:04:35.582891 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580244 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 10:04:35.582891 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580247 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 10:04:35.582891 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580250 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 10:04:35.582891 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580254 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 10:04:35.582891 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580257 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 10:04:35.582891 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580261 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 10:04:35.582891 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580265 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 10:04:35.582891 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580268 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 10:04:35.582891 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580270 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 10:04:35.582891 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580273 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 10:04:35.583264 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580276 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 10:04:35.583264 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580278 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 10:04:35.583264 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580281 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 10:04:35.583264 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580284 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 10:04:35.583264 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580286 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 10:04:35.583264 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580290 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 10:04:35.583264 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580294 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 10:04:35.583264 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580297 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 10:04:35.583264 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580300 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 10:04:35.583264 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580303 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 10:04:35.583264 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580305 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 10:04:35.583264 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580308 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 10:04:35.583264 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580311 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 10:04:35.583264 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580313 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 10:04:35.583264 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580316 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 10:04:35.583264 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580318 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 10:04:35.583264 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580321 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 10:04:35.583264 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580323 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 10:04:35.583264 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580343 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 10:04:35.583810 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580347 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 10:04:35.583810 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580350 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 10:04:35.583810 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580353 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 10:04:35.583810 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580356 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 10:04:35.583810 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580359 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 10:04:35.583810 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580361 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 10:04:35.583810 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580364 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 10:04:35.583810 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580367 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 10:04:35.583810 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580370 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 10:04:35.583810 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580373 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 10:04:35.583810 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580376 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 10:04:35.583810 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580379 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 10:04:35.583810 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580381 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 10:04:35.583810 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580384 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 10:04:35.583810 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580387 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 10:04:35.583810 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580390 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 10:04:35.583810 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580392 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 10:04:35.583810 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580395 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 10:04:35.583810 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580398 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 10:04:35.583810 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580401 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 10:04:35.584313 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580403 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 10:04:35.584313 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580406 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 10:04:35.584313 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580408 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 10:04:35.584313 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580411 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 10:04:35.584313 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580413 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 10:04:35.584313 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580416 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 10:04:35.584313 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580418 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 10:04:35.584313 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580421 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 10:04:35.584313 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580423 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 10:04:35.584313 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580426 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 10:04:35.584313 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580428 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 10:04:35.584313 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580431 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 10:04:35.584313 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580433 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 10:04:35.584313 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580436 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 10:04:35.584313 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580438 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 10:04:35.584313 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580441 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 10:04:35.584313 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580443 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 10:04:35.584313 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580446 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 10:04:35.584313 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580449 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 10:04:35.584313 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580451 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 10:04:35.584941 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580454 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 10:04:35.584941 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580457 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 10:04:35.584941 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580459 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 10:04:35.584941 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580465 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 10:04:35.584941 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580467 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 10:04:35.584941 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580470 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 10:04:35.584941 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580472 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 10:04:35.584941 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580475 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 10:04:35.584941 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580477 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 10:04:35.584941 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580480 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 10:04:35.584941 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580483 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 10:04:35.584941 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580485 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 10:04:35.584941 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:35.580487 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 10:04:35.584941 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.580492 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 10:04:35.584941 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.581217 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 10:04:35.585316 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.583323 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 10:04:35.585316 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.584396 2570 server.go:1019] "Starting client certificate rotation"
Apr 16 10:04:35.585316 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.584507 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 10:04:35.585316 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.584577 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 10:04:35.611111 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.611085 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 10:04:35.614720 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.614355 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 10:04:35.633054 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.633033 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 16 10:04:35.638557 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.638523 2570 log.go:25] "Validated CRI v1 image API"
Apr 16 10:04:35.639724 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.639703 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 10:04:35.639938 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.639923 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 10:04:35.646428 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.646408 2570 fs.go:135] Filesystem UUIDs: map[4e12dd5f-61d5-4fce-97c9-4698e5cd0385:/dev/nvme0n1p3 76fade80-4706-4afd-be04-3dffc1881b76:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 16 10:04:35.646473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.646428 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 10:04:35.652704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.652601 2570 manager.go:217] Machine: {Timestamp:2026-04-16 10:04:35.650513647 +0000 UTC m=+0.430922824 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099291 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23fc600bcf6d4db2462fc224988d75 SystemUUID:ec23fc60-0bcf-6d4d-b246-2fc224988d75 BootID:53d7086c-76a4-4bbe-93b6-047ddebc1f2f Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:09:83:05:d3:f5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:09:83:05:d3:f5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:d6:44:6c:b4:26:89 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 10:04:35.652704 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.652700 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 10:04:35.652804 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.652777 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 10:04:35.653875 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.653851 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 10:04:35.654007 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.653878 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-196.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 10:04:35.654053 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.654016 2570 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 10:04:35.654053 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.654025 2570 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 10:04:35.654053 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.654038 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 10:04:35.655567 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.655556 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 10:04:35.656960 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.656949 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 16 10:04:35.657073 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.657064 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 10:04:35.659932 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.659921 2570 kubelet.go:491] "Attempting to sync node with API server" Apr 16 10:04:35.659965 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.659948 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 10:04:35.659965 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.659963 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 10:04:35.660022 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.659971 2570 kubelet.go:397] "Adding apiserver pod source" Apr 16 10:04:35.660022 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.659987 2570 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 16 10:04:35.661158 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.661146 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 10:04:35.661210 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.661164 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 10:04:35.664304 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.664277 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 10:04:35.666309 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.666297 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 10:04:35.667846 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.667833 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 10:04:35.667912 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.667849 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 10:04:35.667912 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.667855 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 10:04:35.667912 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.667861 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 10:04:35.667912 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.667866 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 10:04:35.667912 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.667875 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 10:04:35.667912 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.667884 2570 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 16 10:04:35.667912 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.667891 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 10:04:35.667912 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.667897 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 10:04:35.667912 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.667903 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 10:04:35.667912 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.667911 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 10:04:35.668211 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.667920 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 10:04:35.668931 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.668880 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 10:04:35.669162 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.669150 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 10:04:35.673573 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.673551 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 10:04:35.673666 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.673610 2570 server.go:1295] "Started kubelet" Apr 16 10:04:35.673730 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.673686 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 10:04:35.673790 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.673756 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 10:04:35.673845 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.673808 2570 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 10:04:35.674400 ip-10-0-143-196 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 10:04:35.674510 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:35.674445 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 10:04:35.674510 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:35.674486 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-196.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 10:04:35.674640 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.674577 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-196.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 10:04:35.675160 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.675145 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 10:04:35.676313 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.676297 2570 server.go:317] "Adding debug handlers to kubelet server" Apr 16 10:04:35.681176 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:35.679983 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-196.ec2.internal.18a6ce3f9e0421e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-196.ec2.internal,UID:ip-10-0-143-196.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-196.ec2.internal,},FirstTimestamp:2026-04-16 10:04:35.673571813 +0000 UTC m=+0.453980993,LastTimestamp:2026-04-16 10:04:35.673571813 +0000 UTC m=+0.453980993,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-196.ec2.internal,}" Apr 16 10:04:35.682120 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.682097 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 10:04:35.682237 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:35.682221 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 10:04:35.682665 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.682650 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 10:04:35.683102 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.683084 2570 factory.go:55] Registering systemd factory Apr 16 10:04:35.683102 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.683104 2570 factory.go:223] Registration of the systemd container factory successfully Apr 16 10:04:35.683297 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.683286 2570 factory.go:153] Registering CRI-O factory Apr 16 10:04:35.683357 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.683298 2570 factory.go:223] Registration of the crio container factory successfully Apr 16 10:04:35.683357 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.683331 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 10:04:35.683357 ip-10-0-143-196 kubenswrapper[2570]: I0416 
10:04:35.683333 2570 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 10:04:35.683357 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.683340 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 10:04:35.683357 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.683357 2570 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 10:04:35.683617 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.683369 2570 factory.go:103] Registering Raw factory Apr 16 10:04:35.683617 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.683381 2570 manager.go:1196] Started watching for new ooms in manager Apr 16 10:04:35.683617 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.683428 2570 reconstruct.go:97] "Volume reconstruction finished" Apr 16 10:04:35.683617 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.683435 2570 reconciler.go:26] "Reconciler: start to sync state" Apr 16 10:04:35.683617 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:35.683566 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-196.ec2.internal\" not found" Apr 16 10:04:35.684624 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.684608 2570 manager.go:319] Starting recovery of all containers Apr 16 10:04:35.684712 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.684655 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-x2wgw" Apr 16 10:04:35.693116 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.693090 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-x2wgw" Apr 16 10:04:35.693890 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:35.693868 2570 
reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 10:04:35.694005 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:35.693986 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-143-196.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 10:04:35.694077 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.694047 2570 manager.go:324] Recovery completed Apr 16 10:04:35.699952 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.699940 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 10:04:35.702520 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.702504 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-196.ec2.internal" event="NodeHasSufficientMemory" Apr 16 10:04:35.702611 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.702553 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-196.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 10:04:35.702611 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.702569 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-196.ec2.internal" event="NodeHasSufficientPID" Apr 16 10:04:35.703064 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.703049 2570 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 10:04:35.703112 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.703065 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 10:04:35.703112 ip-10-0-143-196 
kubenswrapper[2570]: I0416 10:04:35.703083 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 16 10:04:35.705790 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.705777 2570 policy_none.go:49] "None policy: Start" Apr 16 10:04:35.705790 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.705792 2570 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 10:04:35.705910 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.705802 2570 state_mem.go:35] "Initializing new in-memory state store" Apr 16 10:04:35.747234 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.747090 2570 manager.go:341] "Starting Device Plugin manager" Apr 16 10:04:35.759809 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:35.747272 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 10:04:35.759809 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.747287 2570 server.go:85] "Starting device plugin registration server" Apr 16 10:04:35.759809 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.747549 2570 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 10:04:35.759809 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.747559 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 10:04:35.759809 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.747678 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 10:04:35.759809 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.747762 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 10:04:35.759809 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.747772 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 10:04:35.759809 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:35.748230 2570 eviction_manager.go:267] "eviction manager: failed to 
check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 10:04:35.759809 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:35.748262 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-196.ec2.internal\" not found" Apr 16 10:04:35.792722 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.792686 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 10:04:35.793883 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.793867 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 10:04:35.793944 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.793898 2570 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 10:04:35.793944 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.793921 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 10:04:35.793944 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.793930 2570 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 10:04:35.794050 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:35.793969 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 10:04:35.797201 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.797184 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 10:04:35.847895 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.847845 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 10:04:35.848630 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.848614 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-196.ec2.internal" event="NodeHasSufficientMemory" Apr 16 10:04:35.848699 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.848646 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-196.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 10:04:35.848699 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.848657 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-196.ec2.internal" event="NodeHasSufficientPID" Apr 16 10:04:35.848699 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.848679 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-196.ec2.internal" Apr 16 10:04:35.855322 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.855308 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-196.ec2.internal" Apr 16 10:04:35.855393 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:35.855328 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-196.ec2.internal\": node \"ip-10-0-143-196.ec2.internal\" not found" Apr 16 
10:04:35.878893 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:35.878876 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-196.ec2.internal\" not found" Apr 16 10:04:35.894314 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.894291 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-196.ec2.internal"] Apr 16 10:04:35.894420 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.894363 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 10:04:35.895818 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.895802 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-196.ec2.internal" event="NodeHasSufficientMemory" Apr 16 10:04:35.895914 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.895835 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-196.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 10:04:35.895914 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.895849 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-196.ec2.internal" event="NodeHasSufficientPID" Apr 16 10:04:35.897143 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.897128 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 10:04:35.897261 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.897247 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal" Apr 16 10:04:35.897311 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.897276 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 10:04:35.897819 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.897803 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-196.ec2.internal" event="NodeHasSufficientMemory" Apr 16 10:04:35.897909 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.897804 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-196.ec2.internal" event="NodeHasSufficientMemory" Apr 16 10:04:35.897909 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.897829 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-196.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 10:04:35.897909 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.897892 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-196.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 10:04:35.897909 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.897901 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-196.ec2.internal" event="NodeHasSufficientPID" Apr 16 10:04:35.898087 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.897912 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-196.ec2.internal" event="NodeHasSufficientPID" Apr 16 10:04:35.899685 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.899670 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-196.ec2.internal" Apr 16 10:04:35.899796 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.899692 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 10:04:35.900266 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.900251 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-196.ec2.internal" event="NodeHasSufficientMemory" Apr 16 10:04:35.900333 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.900279 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-196.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 10:04:35.900333 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:35.900293 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-196.ec2.internal" event="NodeHasSufficientPID" Apr 16 10:04:35.928081 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:35.928063 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-196.ec2.internal\" not found" node="ip-10-0-143-196.ec2.internal" Apr 16 10:04:35.932621 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:35.932606 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-196.ec2.internal\" not found" node="ip-10-0-143-196.ec2.internal" Apr 16 10:04:35.979147 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:35.979120 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-196.ec2.internal\" not found" Apr 16 10:04:36.080217 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:36.080177 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-196.ec2.internal\" not found" Apr 16 10:04:36.084514 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.084495 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/df50e37f169a5ef9261defc9686381a1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal\" (UID: \"df50e37f169a5ef9261defc9686381a1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal" Apr 16 10:04:36.084628 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.084547 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9c721cdecb4375340a2fbe75779b609c-config\") pod \"kube-apiserver-proxy-ip-10-0-143-196.ec2.internal\" (UID: \"9c721cdecb4375340a2fbe75779b609c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-196.ec2.internal" Apr 16 10:04:36.084628 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.084576 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/df50e37f169a5ef9261defc9686381a1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal\" (UID: \"df50e37f169a5ef9261defc9686381a1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal" Apr 16 10:04:36.180928 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:36.180855 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-196.ec2.internal\" not found" Apr 16 10:04:36.185230 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.185210 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/df50e37f169a5ef9261defc9686381a1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal\" (UID: \"df50e37f169a5ef9261defc9686381a1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal" Apr 16 
10:04:36.185285 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.185238 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/df50e37f169a5ef9261defc9686381a1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal\" (UID: \"df50e37f169a5ef9261defc9686381a1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal" Apr 16 10:04:36.185285 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.185257 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9c721cdecb4375340a2fbe75779b609c-config\") pod \"kube-apiserver-proxy-ip-10-0-143-196.ec2.internal\" (UID: \"9c721cdecb4375340a2fbe75779b609c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-196.ec2.internal" Apr 16 10:04:36.185350 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.185286 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/df50e37f169a5ef9261defc9686381a1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal\" (UID: \"df50e37f169a5ef9261defc9686381a1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal" Apr 16 10:04:36.185350 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.185295 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/df50e37f169a5ef9261defc9686381a1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal\" (UID: \"df50e37f169a5ef9261defc9686381a1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal" Apr 16 10:04:36.185350 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.185321 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/9c721cdecb4375340a2fbe75779b609c-config\") pod \"kube-apiserver-proxy-ip-10-0-143-196.ec2.internal\" (UID: \"9c721cdecb4375340a2fbe75779b609c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-196.ec2.internal" Apr 16 10:04:36.231307 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.231282 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal" Apr 16 10:04:36.234988 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.234971 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-196.ec2.internal" Apr 16 10:04:36.281100 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:36.281066 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-196.ec2.internal\" not found" Apr 16 10:04:36.381623 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:36.381578 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-196.ec2.internal\" not found" Apr 16 10:04:36.482133 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:36.482042 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-196.ec2.internal\" not found" Apr 16 10:04:36.582580 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:36.582546 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-196.ec2.internal\" not found" Apr 16 10:04:36.583684 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.583670 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 10:04:36.583817 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.583802 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very 
short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 10:04:36.634696 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.634672 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 10:04:36.676724 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.676702 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 10:04:36.682983 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:36.682956 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-196.ec2.internal\" not found" Apr 16 10:04:36.682983 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.682966 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 10:04:36.695182 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.695159 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 10:04:36.695275 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.695243 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 09:59:35 +0000 UTC" deadline="2027-10-25 13:57:18.498699052 +0000 UTC" Apr 16 10:04:36.695275 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.695260 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13371h52m41.803440899s" Apr 16 10:04:36.728191 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.728171 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-j4khq" Apr 16 10:04:36.733850 ip-10-0-143-196 
kubenswrapper[2570]: I0416 10:04:36.733803 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-j4khq" Apr 16 10:04:36.783167 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:36.783143 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-196.ec2.internal\" not found" Apr 16 10:04:36.853272 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:36.853240 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c721cdecb4375340a2fbe75779b609c.slice/crio-560a8e8e2493cfd6452f5d0d18719aacb1e210a7e91ebd5992fd0c2a55359323 WatchSource:0}: Error finding container 560a8e8e2493cfd6452f5d0d18719aacb1e210a7e91ebd5992fd0c2a55359323: Status 404 returned error can't find the container with id 560a8e8e2493cfd6452f5d0d18719aacb1e210a7e91ebd5992fd0c2a55359323 Apr 16 10:04:36.857899 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:36.857884 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 10:04:36.883604 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:36.883580 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-196.ec2.internal\" not found" Apr 16 10:04:36.983967 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:36.983906 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-196.ec2.internal\" not found" Apr 16 10:04:37.084344 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:37.084324 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-196.ec2.internal\" not found" Apr 16 10:04:37.185090 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:37.185060 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-196.ec2.internal\" not found" Apr 16 10:04:37.271568 ip-10-0-143-196 
kubenswrapper[2570]: I0416 10:04:37.271480 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 10:04:37.283103 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.283078 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal" Apr 16 10:04:37.297820 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.297801 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 10:04:37.299112 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.299100 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-196.ec2.internal" Apr 16 10:04:37.305575 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.305558 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 10:04:37.339233 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:37.339205 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf50e37f169a5ef9261defc9686381a1.slice/crio-f9556a50cfbc757178a478573800fccbe128e38bf8de27cc865647c491d97327 WatchSource:0}: Error finding container f9556a50cfbc757178a478573800fccbe128e38bf8de27cc865647c491d97327: Status 404 returned error can't find the container with id f9556a50cfbc757178a478573800fccbe128e38bf8de27cc865647c491d97327 Apr 16 10:04:37.456203 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.456167 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 10:04:37.661276 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.661206 2570 apiserver.go:52] "Watching apiserver" 
Apr 16 10:04:37.670589 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.670571 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 10:04:37.670922 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.670903 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-8hjnc","openshift-image-registry/node-ca-v8sh7","openshift-multus/multus-additional-cni-plugins-v9smg","openshift-multus/multus-hhvjb","openshift-multus/network-metrics-daemon-9mqb2","openshift-ovn-kubernetes/ovnkube-node-tmhhw","kube-system/konnectivity-agent-s7wkr","kube-system/kube-apiserver-proxy-ip-10-0-143-196.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2","openshift-dns/node-resolver-rj4q8","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal","openshift-network-diagnostics/network-check-target-2h7wk","openshift-network-operator/iptables-alerter-g9ftj"] Apr 16 10:04:37.673937 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.673921 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-s7wkr" Apr 16 10:04:37.676156 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.676137 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 10:04:37.676253 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.676178 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-tglkg\"" Apr 16 10:04:37.676315 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.676266 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 10:04:37.676315 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.676268 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-v9smg" Apr 16 10:04:37.677559 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.677508 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.677668 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.677626 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:04:37.677770 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:37.677720 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9mqb2" podUID="bccbed26-7fad-44dd-b120-4fe2758154e5" Apr 16 10:04:37.678318 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.678299 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 10:04:37.678419 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.678298 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 10:04:37.678419 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.678383 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 10:04:37.678601 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.678585 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 10:04:37.678601 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.678597 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 10:04:37.678735 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.678716 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-fdrpx\"" Apr 16 10:04:37.679061 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.679039 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.679566 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.679549 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qntdr\"" Apr 16 10:04:37.679674 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.679555 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 10:04:37.680306 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.680288 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h7wk" Apr 16 10:04:37.680379 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:37.680348 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2h7wk" podUID="dba5a2c9-dd08-4be6-a76d-85742ada944e" Apr 16 10:04:37.681212 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.681193 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 10:04:37.681317 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.681285 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 10:04:37.681396 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.681374 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 10:04:37.681546 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.681517 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-nrs6w\"" Apr 16 10:04:37.681630 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.681606 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 10:04:37.681685 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.681649 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 10:04:37.681795 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.681778 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 10:04:37.682148 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.682129 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-v8sh7" Apr 16 10:04:37.683340 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.683319 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8hjnc" Apr 16 10:04:37.684239 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.684211 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 10:04:37.685330 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.684584 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 10:04:37.685330 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.684822 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7nsgk\"" Apr 16 10:04:37.685330 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.684866 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 10:04:37.685735 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.685694 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 10:04:37.686480 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.686134 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2" Apr 16 10:04:37.686724 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.686704 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 10:04:37.686825 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.686806 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dg9zt\"" Apr 16 10:04:37.688125 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.688106 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 10:04:37.688369 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.688351 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rj4q8" Apr 16 10:04:37.688464 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.688352 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 10:04:37.688761 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.688747 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-2kc5f\"" Apr 16 10:04:37.688761 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.688754 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 10:04:37.689877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.689860 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-g9ftj" Apr 16 10:04:37.690191 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.690173 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sb49x\"" Apr 16 10:04:37.690289 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.690224 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 10:04:37.690289 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.690238 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 10:04:37.691919 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.691902 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 10:04:37.692075 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.691986 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 10:04:37.692075 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692017 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-log-socket\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.692075 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692051 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/23649a52-557a-477c-838c-84b209078bbb-ovn-node-metrics-cert\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.692287 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692092 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4a61fb09-0793-4b69-b34c-784caf0249e5-serviceca\") pod \"node-ca-v8sh7\" (UID: \"4a61fb09-0793-4b69-b34c-784caf0249e5\") " pod="openshift-image-registry/node-ca-v8sh7" Apr 16 10:04:37.692287 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692117 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a61fb09-0793-4b69-b34c-784caf0249e5-host\") pod \"node-ca-v8sh7\" (UID: \"4a61fb09-0793-4b69-b34c-784caf0249e5\") " pod="openshift-image-registry/node-ca-v8sh7" Apr 16 10:04:37.692287 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692141 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-host-var-lib-cni-multus\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.692287 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692192 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs\") pod \"network-metrics-daemon-9mqb2\" (UID: \"bccbed26-7fad-44dd-b120-4fe2758154e5\") " pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:04:37.692287 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692219 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzbsv\" (UniqueName: 
\"kubernetes.io/projected/bccbed26-7fad-44dd-b120-4fe2758154e5-kube-api-access-rzbsv\") pod \"network-metrics-daemon-9mqb2\" (UID: \"bccbed26-7fad-44dd-b120-4fe2758154e5\") " pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:04:37.692287 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692237 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-run-openvswitch\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.692287 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692253 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/191a958c-a1a7-4e33-9456-1f482a72fb5e-cnibin\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg" Apr 16 10:04:37.692287 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692278 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/191a958c-a1a7-4e33-9456-1f482a72fb5e-os-release\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg" Apr 16 10:04:37.692688 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692305 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ql5r\" (UniqueName: \"kubernetes.io/projected/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-kube-api-access-8ql5r\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc" Apr 16 10:04:37.692688 ip-10-0-143-196 
kubenswrapper[2570]: I0416 10:04:37.692341 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e3597b6a-0a19-49d2-8f48-59d7320d0993-kubelet-dir\") pod \"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2" Apr 16 10:04:37.692688 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692346 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jn6ct\"" Apr 16 10:04:37.692688 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692349 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 10:04:37.692688 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692389 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ee8a5c7c-8f19-49e3-8fed-4d2a3920d7cf-agent-certs\") pod \"konnectivity-agent-s7wkr\" (UID: \"ee8a5c7c-8f19-49e3-8fed-4d2a3920d7cf\") " pod="kube-system/konnectivity-agent-s7wkr" Apr 16 10:04:37.692688 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692417 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/22b85402-76c2-472c-90f0-25a54604bbb9-multus-daemon-config\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.692688 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692442 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.692688 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692465 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e3597b6a-0a19-49d2-8f48-59d7320d0993-socket-dir\") pod \"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2" Apr 16 10:04:37.692688 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692490 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-multus-conf-dir\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.692688 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692514 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zlc4\" (UniqueName: \"kubernetes.io/projected/22b85402-76c2-472c-90f0-25a54604bbb9-kube-api-access-2zlc4\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.692688 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692558 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-var-lib-openvswitch\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.692688 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692577 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-cni-netd\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.692688 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692592 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ee8a5c7c-8f19-49e3-8fed-4d2a3920d7cf-konnectivity-ca\") pod \"konnectivity-agent-s7wkr\" (UID: \"ee8a5c7c-8f19-49e3-8fed-4d2a3920d7cf\") " pod="kube-system/konnectivity-agent-s7wkr" Apr 16 10:04:37.692688 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692613 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-host-var-lib-cni-bin\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.692688 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692640 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e3597b6a-0a19-49d2-8f48-59d7320d0993-sys-fs\") pod \"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2" Apr 16 10:04:37.692688 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692670 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-kubelet\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.692688 ip-10-0-143-196 kubenswrapper[2570]: I0416 
10:04:37.692691 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/23649a52-557a-477c-838c-84b209078bbb-ovnkube-config\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.693548 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692710 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-systemd-units\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.693548 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692727 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/191a958c-a1a7-4e33-9456-1f482a72fb5e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg"
Apr 16 10:04:37.693548 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692741 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdxwr\" (UniqueName: \"kubernetes.io/projected/4a61fb09-0793-4b69-b34c-784caf0249e5-kube-api-access-qdxwr\") pod \"node-ca-v8sh7\" (UID: \"4a61fb09-0793-4b69-b34c-784caf0249e5\") " pod="openshift-image-registry/node-ca-v8sh7"
Apr 16 10:04:37.693548 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692775 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjn79\" (UniqueName: \"kubernetes.io/projected/e3597b6a-0a19-49d2-8f48-59d7320d0993-kube-api-access-qjn79\") pod \"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2"
Apr 16 10:04:37.693548 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692810 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-multus-socket-dir-parent\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.693548 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692834 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhjtp\" (UniqueName: \"kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp\") pod \"network-check-target-2h7wk\" (UID: \"dba5a2c9-dd08-4be6-a76d-85742ada944e\") " pod="openshift-network-diagnostics/network-check-target-2h7wk"
Apr 16 10:04:37.693548 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692857 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e3597b6a-0a19-49d2-8f48-59d7320d0993-device-dir\") pod \"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2"
Apr 16 10:04:37.693548 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692894 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-system-cni-dir\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.693548 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692917 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-multus-cni-dir\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.693548 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692948 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-os-release\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.693548 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.692984 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-hostroot\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.693548 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693009 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-run-netns\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.693548 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693034 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/23649a52-557a-477c-838c-84b209078bbb-ovnkube-script-lib\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.693548 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693059 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/191a958c-a1a7-4e33-9456-1f482a72fb5e-cni-binary-copy\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg"
Apr 16 10:04:37.693548 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693083 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/191a958c-a1a7-4e33-9456-1f482a72fb5e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg"
Apr 16 10:04:37.693548 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693134 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-run-systemd\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.694259 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693183 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-cni-bin\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.694259 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693233 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-kubernetes\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.694259 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693253 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-run\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.694259 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693271 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-sys\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.694259 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693284 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-host\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.694259 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693330 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e3597b6a-0a19-49d2-8f48-59d7320d0993-etc-selinux\") pod \"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2"
Apr 16 10:04:37.694259 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693373 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-etc-kubernetes\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.694259 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693397 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/23649a52-557a-477c-838c-84b209078bbb-env-overrides\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.694259 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693422 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/191a958c-a1a7-4e33-9456-1f482a72fb5e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg"
Apr 16 10:04:37.694259 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693448 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-sysctl-d\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.694259 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693472 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-systemd\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.694259 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693498 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-tmp\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.694259 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693548 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e3597b6a-0a19-49d2-8f48-59d7320d0993-registration-dir\") pod \"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2"
Apr 16 10:04:37.694259 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693590 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-cnibin\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.694259 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693616 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-host-var-lib-kubelet\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.694259 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693637 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-node-log\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.694259 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693662 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7rtm\" (UniqueName: \"kubernetes.io/projected/23649a52-557a-477c-838c-84b209078bbb-kube-api-access-z7rtm\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.695190 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693686 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/191a958c-a1a7-4e33-9456-1f482a72fb5e-system-cni-dir\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg"
Apr 16 10:04:37.695190 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693711 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-sysconfig\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.695190 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693733 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/22b85402-76c2-472c-90f0-25a54604bbb9-cni-binary-copy\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.695190 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693754 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-etc-openvswitch\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.695190 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693778 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-run-ovn\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.695190 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693800 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-sysctl-conf\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.695190 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693822 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-var-lib-kubelet\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.695190 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693844 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-tuned\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.695190 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693870 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-host-run-netns\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.695190 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693894 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-host-run-multus-certs\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.695190 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693925 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-slash\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.695190 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693947 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-lib-modules\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.695190 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693969 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-host-run-k8s-cni-cncf-io\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.695190 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.693993 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.695190 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.694033 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwnzv\" (UniqueName: \"kubernetes.io/projected/191a958c-a1a7-4e33-9456-1f482a72fb5e-kube-api-access-dwnzv\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg"
Apr 16 10:04:37.695190 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.694072 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-modprobe-d\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.735652 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.735619 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 09:59:36 +0000 UTC" deadline="2027-12-23 09:42:34.282599168 +0000 UTC"
Apr 16 10:04:37.735652 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.735652 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14783h37m56.546951039s"
Apr 16 10:04:37.784277 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.784250 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 10:04:37.795020 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.794996 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-kubelet\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.795020 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795024 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/23649a52-557a-477c-838c-84b209078bbb-ovnkube-config\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.795207 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795039 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-systemd-units\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.795207 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795064 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/191a958c-a1a7-4e33-9456-1f482a72fb5e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg"
Apr 16 10:04:37.795207 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795099 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-kubelet\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.795207 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795148 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-systemd-units\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.795392 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795245 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdxwr\" (UniqueName: \"kubernetes.io/projected/4a61fb09-0793-4b69-b34c-784caf0249e5-kube-api-access-qdxwr\") pod \"node-ca-v8sh7\" (UID: \"4a61fb09-0793-4b69-b34c-784caf0249e5\") " pod="openshift-image-registry/node-ca-v8sh7"
Apr 16 10:04:37.795392 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795276 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjn79\" (UniqueName: \"kubernetes.io/projected/e3597b6a-0a19-49d2-8f48-59d7320d0993-kube-api-access-qjn79\") pod \"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2"
Apr 16 10:04:37.795392 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795303 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-multus-socket-dir-parent\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.795392 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795326 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhjtp\" (UniqueName: \"kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp\") pod \"network-check-target-2h7wk\" (UID: \"dba5a2c9-dd08-4be6-a76d-85742ada944e\") " pod="openshift-network-diagnostics/network-check-target-2h7wk"
Apr 16 10:04:37.795392 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795350 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e3597b6a-0a19-49d2-8f48-59d7320d0993-device-dir\") pod \"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2"
Apr 16 10:04:37.795392 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795373 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-system-cni-dir\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.795699 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795399 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-multus-cni-dir\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.795699 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795421 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-os-release\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.795699 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795442 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-hostroot\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.795699 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795466 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-run-netns\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.795699 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795509 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/23649a52-557a-477c-838c-84b209078bbb-ovnkube-script-lib\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.795699 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795550 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/191a958c-a1a7-4e33-9456-1f482a72fb5e-cni-binary-copy\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg"
Apr 16 10:04:37.795699 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795576 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/191a958c-a1a7-4e33-9456-1f482a72fb5e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg"
Apr 16 10:04:37.795699 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795600 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-run-systemd\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.795699 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795631 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-cni-bin\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.795699 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795657 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-kubernetes\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.795699 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795681 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-run\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.795699 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795693 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/191a958c-a1a7-4e33-9456-1f482a72fb5e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg"
Apr 16 10:04:37.796262 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795717 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/23649a52-557a-477c-838c-84b209078bbb-ovnkube-config\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.796262 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795716 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-sys\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.796262 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795777 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-sys\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.796262 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795775 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-host\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.796262 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795813 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e3597b6a-0a19-49d2-8f48-59d7320d0993-etc-selinux\") pod \"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2"
Apr 16 10:04:37.796262 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795845 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fc1a6ffe-e50d-41c1-aa12-db0a4d10232f-hosts-file\") pod \"node-resolver-rj4q8\" (UID: \"fc1a6ffe-e50d-41c1-aa12-db0a4d10232f\") " pod="openshift-dns/node-resolver-rj4q8"
Apr 16 10:04:37.796262 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795871 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-etc-kubernetes\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.796262 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795894 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/23649a52-557a-477c-838c-84b209078bbb-env-overrides\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.796262 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795918 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/191a958c-a1a7-4e33-9456-1f482a72fb5e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg"
Apr 16 10:04:37.796262 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795944 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-sysctl-d\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.796262 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795954 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-host\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.796262 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.795968 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-systemd\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.796262 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796001 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-tmp\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.796262 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796030 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e3597b6a-0a19-49d2-8f48-59d7320d0993-registration-dir\") pod \"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2"
Apr 16 10:04:37.796262 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796065 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-cnibin\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.796262 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796092 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-host-var-lib-kubelet\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.796262 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796108 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e3597b6a-0a19-49d2-8f48-59d7320d0993-device-dir\") pod \"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2"
Apr 16 10:04:37.796262 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796117 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-node-log\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.797064 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796124 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-multus-socket-dir-parent\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.797064 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796156 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7rtm\" (UniqueName: \"kubernetes.io/projected/23649a52-557a-477c-838c-84b209078bbb-kube-api-access-z7rtm\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.797064 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796166 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-node-log\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.797064 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796160 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-run-netns\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.797064 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796185 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/191a958c-a1a7-4e33-9456-1f482a72fb5e-system-cni-dir\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg"
Apr 16 10:04:37.797064 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796212 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-sysconfig\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.797064 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796237 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/22b85402-76c2-472c-90f0-25a54604bbb9-cni-binary-copy\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.797064 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796261 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\"
(UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-etc-openvswitch\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.797064 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796283 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-run-ovn\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.797064 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796293 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-multus-cni-dir\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.797064 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796308 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-sysctl-conf\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc" Apr 16 10:04:37.797064 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796327 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-os-release\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.797064 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796334 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-var-lib-kubelet\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc" Apr 16 10:04:37.797064 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796343 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-system-cni-dir\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.797064 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796361 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-tuned\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc" Apr 16 10:04:37.797064 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796551 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-host-var-lib-kubelet\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.797064 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796589 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-run-ovn\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.797064 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796621 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-etc-openvswitch\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.797840 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796638 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-cni-bin\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.797840 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796661 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-kubernetes\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc" Apr 16 10:04:37.797840 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796693 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/23649a52-557a-477c-838c-84b209078bbb-ovnkube-script-lib\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.797840 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796698 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-run\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc" Apr 16 10:04:37.797840 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796727 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-hostroot\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.797840 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796764 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-systemd\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc" Apr 16 10:04:37.797840 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796766 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-sysctl-conf\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc" Apr 16 10:04:37.797840 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796804 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e3597b6a-0a19-49d2-8f48-59d7320d0993-etc-selinux\") pod \"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2" Apr 16 10:04:37.797840 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796815 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/191a958c-a1a7-4e33-9456-1f482a72fb5e-system-cni-dir\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg" Apr 16 10:04:37.797840 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796850 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-etc-kubernetes\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.797840 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796856 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq5gv\" (UniqueName: \"kubernetes.io/projected/fc1a6ffe-e50d-41c1-aa12-db0a4d10232f-kube-api-access-tq5gv\") pod \"node-resolver-rj4q8\" (UID: \"fc1a6ffe-e50d-41c1-aa12-db0a4d10232f\") " pod="openshift-dns/node-resolver-rj4q8" Apr 16 10:04:37.797840 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796876 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54582370-009e-4add-bc9e-5a5c18069e72-host-slash\") pod \"iptables-alerter-g9ftj\" (UID: \"54582370-009e-4add-bc9e-5a5c18069e72\") " pod="openshift-network-operator/iptables-alerter-g9ftj" Apr 16 10:04:37.797840 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796895 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-host-run-netns\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.797840 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796916 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-host-run-multus-certs\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.797840 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796941 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-slash\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.797840 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796955 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-var-lib-kubelet\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc" Apr 16 10:04:37.797840 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796968 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-lib-modules\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc" Apr 16 10:04:37.797840 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796945 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 10:04:37.798588 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796999 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/54582370-009e-4add-bc9e-5a5c18069e72-iptables-alerter-script\") pod \"iptables-alerter-g9ftj\" (UID: \"54582370-009e-4add-bc9e-5a5c18069e72\") " pod="openshift-network-operator/iptables-alerter-g9ftj" Apr 16 10:04:37.798588 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796989 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-sysconfig\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc" Apr 16 10:04:37.798588 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.796781 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/22b85402-76c2-472c-90f0-25a54604bbb9-cni-binary-copy\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.798588 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797042 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-host-run-netns\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.798588 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797059 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e3597b6a-0a19-49d2-8f48-59d7320d0993-registration-dir\") pod 
\"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2" Apr 16 10:04:37.798588 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797073 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-host-run-k8s-cni-cncf-io\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.798588 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797090 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-run-systemd\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.798588 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797103 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.798588 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797134 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwnzv\" (UniqueName: \"kubernetes.io/projected/191a958c-a1a7-4e33-9456-1f482a72fb5e-kube-api-access-dwnzv\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg" Apr 16 10:04:37.798588 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797161 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-modprobe-d\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc" Apr 16 10:04:37.798588 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797179 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-sysctl-d\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc" Apr 16 10:04:37.798588 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797184 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-host-run-multus-certs\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.798588 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797236 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/191a958c-a1a7-4e33-9456-1f482a72fb5e-cni-binary-copy\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg" Apr 16 10:04:37.798588 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797243 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-slash\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.798588 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797243 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-lib-modules\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc" Apr 16 10:04:37.798588 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797273 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-cnibin\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.798588 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797290 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fc1a6ffe-e50d-41c1-aa12-db0a4d10232f-tmp-dir\") pod \"node-resolver-rj4q8\" (UID: \"fc1a6ffe-e50d-41c1-aa12-db0a4d10232f\") " pod="openshift-dns/node-resolver-rj4q8" Apr 16 10:04:37.799399 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797300 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.799399 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797320 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-log-socket\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.799399 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797345 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-modprobe-d\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc" Apr 16 10:04:37.799399 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797433 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/23649a52-557a-477c-838c-84b209078bbb-ovn-node-metrics-cert\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.799399 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797441 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/191a958c-a1a7-4e33-9456-1f482a72fb5e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg" Apr 16 10:04:37.799399 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797459 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-host-run-k8s-cni-cncf-io\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.799399 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797462 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4a61fb09-0793-4b69-b34c-784caf0249e5-serviceca\") pod \"node-ca-v8sh7\" (UID: \"4a61fb09-0793-4b69-b34c-784caf0249e5\") " pod="openshift-image-registry/node-ca-v8sh7" Apr 16 10:04:37.799399 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797487 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/23649a52-557a-477c-838c-84b209078bbb-env-overrides\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.799399 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797517 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-log-socket\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:04:37.799399 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797519 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a61fb09-0793-4b69-b34c-784caf0249e5-host\") pod \"node-ca-v8sh7\" (UID: \"4a61fb09-0793-4b69-b34c-784caf0249e5\") " pod="openshift-image-registry/node-ca-v8sh7" Apr 16 10:04:37.799399 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797641 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-host-var-lib-cni-multus\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.799399 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797643 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/191a958c-a1a7-4e33-9456-1f482a72fb5e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg" Apr 16 10:04:37.799399 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797680 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host\" (UniqueName: \"kubernetes.io/host-path/4a61fb09-0793-4b69-b34c-784caf0249e5-host\") pod \"node-ca-v8sh7\" (UID: \"4a61fb09-0793-4b69-b34c-784caf0249e5\") " pod="openshift-image-registry/node-ca-v8sh7" Apr 16 10:04:37.799399 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797719 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs\") pod \"network-metrics-daemon-9mqb2\" (UID: \"bccbed26-7fad-44dd-b120-4fe2758154e5\") " pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:04:37.799399 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797743 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-host-var-lib-cni-multus\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb" Apr 16 10:04:37.799399 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797801 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzbsv\" (UniqueName: \"kubernetes.io/projected/bccbed26-7fad-44dd-b120-4fe2758154e5-kube-api-access-rzbsv\") pod \"network-metrics-daemon-9mqb2\" (UID: \"bccbed26-7fad-44dd-b120-4fe2758154e5\") " pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:04:37.799399 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:37.797823 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:37.799399 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797878 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4a61fb09-0793-4b69-b34c-784caf0249e5-serviceca\") pod \"node-ca-v8sh7\" (UID: \"4a61fb09-0793-4b69-b34c-784caf0249e5\") " 
pod="openshift-image-registry/node-ca-v8sh7"
Apr 16 10:04:37.800218 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:37.797922 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs podName:bccbed26-7fad-44dd-b120-4fe2758154e5 nodeName:}" failed. No retries permitted until 2026-04-16 10:04:38.297880523 +0000 UTC m=+3.078289707 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs") pod "network-metrics-daemon-9mqb2" (UID: "bccbed26-7fad-44dd-b120-4fe2758154e5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 10:04:37.800218 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797947 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-run-openvswitch\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.800218 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.797983 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/191a958c-a1a7-4e33-9456-1f482a72fb5e-cnibin\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg"
Apr 16 10:04:37.800218 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798020 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/191a958c-a1a7-4e33-9456-1f482a72fb5e-os-release\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg"
Apr 16 10:04:37.800218 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798039 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/191a958c-a1a7-4e33-9456-1f482a72fb5e-cnibin\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg"
Apr 16 10:04:37.800218 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798094 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ql5r\" (UniqueName: \"kubernetes.io/projected/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-kube-api-access-8ql5r\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.800218 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798114 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/191a958c-a1a7-4e33-9456-1f482a72fb5e-os-release\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg"
Apr 16 10:04:37.800218 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798124 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e3597b6a-0a19-49d2-8f48-59d7320d0993-kubelet-dir\") pod \"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2"
Apr 16 10:04:37.800218 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798151 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ee8a5c7c-8f19-49e3-8fed-4d2a3920d7cf-agent-certs\") pod \"konnectivity-agent-s7wkr\" (UID: \"ee8a5c7c-8f19-49e3-8fed-4d2a3920d7cf\") " pod="kube-system/konnectivity-agent-s7wkr"
Apr 16 10:04:37.800218 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798203 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/22b85402-76c2-472c-90f0-25a54604bbb9-multus-daemon-config\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.800218 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798214 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e3597b6a-0a19-49d2-8f48-59d7320d0993-kubelet-dir\") pod \"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2"
Apr 16 10:04:37.800218 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798248 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-run-ovn-kubernetes\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.800218 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798276 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e3597b6a-0a19-49d2-8f48-59d7320d0993-socket-dir\") pod \"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2"
Apr 16 10:04:37.800218 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798316 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-run-openvswitch\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.800218 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798315 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-run-ovn-kubernetes\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.800218 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798395 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e3597b6a-0a19-49d2-8f48-59d7320d0993-socket-dir\") pod \"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2"
Apr 16 10:04:37.800918 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798444 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8zlb\" (UniqueName: \"kubernetes.io/projected/54582370-009e-4add-bc9e-5a5c18069e72-kube-api-access-b8zlb\") pod \"iptables-alerter-g9ftj\" (UID: \"54582370-009e-4add-bc9e-5a5c18069e72\") " pod="openshift-network-operator/iptables-alerter-g9ftj"
Apr 16 10:04:37.800918 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798487 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-multus-conf-dir\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.800918 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798555 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-multus-conf-dir\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.800918 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798595 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zlc4\" (UniqueName: \"kubernetes.io/projected/22b85402-76c2-472c-90f0-25a54604bbb9-kube-api-access-2zlc4\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.800918 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798630 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-var-lib-openvswitch\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.800918 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798771 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/22b85402-76c2-472c-90f0-25a54604bbb9-multus-daemon-config\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.800918 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798831 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-var-lib-openvswitch\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.800918 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798834 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-cni-netd\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.800918 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798876 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ee8a5c7c-8f19-49e3-8fed-4d2a3920d7cf-konnectivity-ca\") pod \"konnectivity-agent-s7wkr\" (UID: \"ee8a5c7c-8f19-49e3-8fed-4d2a3920d7cf\") " pod="kube-system/konnectivity-agent-s7wkr"
Apr 16 10:04:37.800918 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798899 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/23649a52-557a-477c-838c-84b209078bbb-host-cni-netd\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.800918 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798936 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-host-var-lib-cni-bin\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.800918 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798994 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e3597b6a-0a19-49d2-8f48-59d7320d0993-sys-fs\") pod \"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2"
Apr 16 10:04:37.800918 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.798994 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/22b85402-76c2-472c-90f0-25a54604bbb9-host-var-lib-cni-bin\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.800918 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.799085 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e3597b6a-0a19-49d2-8f48-59d7320d0993-sys-fs\") pod \"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2"
Apr 16 10:04:37.800918 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.799353 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ee8a5c7c-8f19-49e3-8fed-4d2a3920d7cf-konnectivity-ca\") pod \"konnectivity-agent-s7wkr\" (UID: \"ee8a5c7c-8f19-49e3-8fed-4d2a3920d7cf\") " pod="kube-system/konnectivity-agent-s7wkr"
Apr 16 10:04:37.800918 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.800800 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal" event={"ID":"df50e37f169a5ef9261defc9686381a1","Type":"ContainerStarted","Data":"f9556a50cfbc757178a478573800fccbe128e38bf8de27cc865647c491d97327"}
Apr 16 10:04:37.801613 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.800950 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-etc-tuned\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.801613 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.801225 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-tmp\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.801613 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.801495 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ee8a5c7c-8f19-49e3-8fed-4d2a3920d7cf-agent-certs\") pod \"konnectivity-agent-s7wkr\" (UID: \"ee8a5c7c-8f19-49e3-8fed-4d2a3920d7cf\") " pod="kube-system/konnectivity-agent-s7wkr"
Apr 16 10:04:37.801613 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.801598 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/23649a52-557a-477c-838c-84b209078bbb-ovn-node-metrics-cert\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.801950 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.801923 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-196.ec2.internal" event={"ID":"9c721cdecb4375340a2fbe75779b609c","Type":"ContainerStarted","Data":"560a8e8e2493cfd6452f5d0d18719aacb1e210a7e91ebd5992fd0c2a55359323"}
Apr 16 10:04:37.805057 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.805035 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwnzv\" (UniqueName: \"kubernetes.io/projected/191a958c-a1a7-4e33-9456-1f482a72fb5e-kube-api-access-dwnzv\") pod \"multus-additional-cni-plugins-v9smg\" (UID: \"191a958c-a1a7-4e33-9456-1f482a72fb5e\") " pod="openshift-multus/multus-additional-cni-plugins-v9smg"
Apr 16 10:04:37.807051 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:37.806910 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 10:04:37.807051 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:37.806934 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 10:04:37.807051 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:37.806948 2570 projected.go:194] Error preparing data for projected volume kube-api-access-mhjtp for pod openshift-network-diagnostics/network-check-target-2h7wk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 10:04:37.807051 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:37.807007 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp podName:dba5a2c9-dd08-4be6-a76d-85742ada944e nodeName:}" failed. No retries permitted until 2026-04-16 10:04:38.306990815 +0000 UTC m=+3.087399997 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-mhjtp" (UniqueName: "kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp") pod "network-check-target-2h7wk" (UID: "dba5a2c9-dd08-4be6-a76d-85742ada944e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 10:04:37.808764 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.808461 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdxwr\" (UniqueName: \"kubernetes.io/projected/4a61fb09-0793-4b69-b34c-784caf0249e5-kube-api-access-qdxwr\") pod \"node-ca-v8sh7\" (UID: \"4a61fb09-0793-4b69-b34c-784caf0249e5\") " pod="openshift-image-registry/node-ca-v8sh7"
Apr 16 10:04:37.808863 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.808782 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjn79\" (UniqueName: \"kubernetes.io/projected/e3597b6a-0a19-49d2-8f48-59d7320d0993-kube-api-access-qjn79\") pod \"aws-ebs-csi-driver-node-prdq2\" (UID: \"e3597b6a-0a19-49d2-8f48-59d7320d0993\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2"
Apr 16 10:04:37.809510 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.809486 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7rtm\" (UniqueName: \"kubernetes.io/projected/23649a52-557a-477c-838c-84b209078bbb-kube-api-access-z7rtm\") pod \"ovnkube-node-tmhhw\" (UID: \"23649a52-557a-477c-838c-84b209078bbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:37.810614 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.810592 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zlc4\" (UniqueName: \"kubernetes.io/projected/22b85402-76c2-472c-90f0-25a54604bbb9-kube-api-access-2zlc4\") pod \"multus-hhvjb\" (UID: \"22b85402-76c2-472c-90f0-25a54604bbb9\") " pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:37.811136 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.811117 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzbsv\" (UniqueName: \"kubernetes.io/projected/bccbed26-7fad-44dd-b120-4fe2758154e5-kube-api-access-rzbsv\") pod \"network-metrics-daemon-9mqb2\" (UID: \"bccbed26-7fad-44dd-b120-4fe2758154e5\") " pod="openshift-multus/network-metrics-daemon-9mqb2"
Apr 16 10:04:37.811623 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.811605 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ql5r\" (UniqueName: \"kubernetes.io/projected/9fb1190b-eb18-4a5c-91c6-5bec59d57dc4-kube-api-access-8ql5r\") pod \"tuned-8hjnc\" (UID: \"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4\") " pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:37.899425 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.899395 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/54582370-009e-4add-bc9e-5a5c18069e72-iptables-alerter-script\") pod \"iptables-alerter-g9ftj\" (UID: \"54582370-009e-4add-bc9e-5a5c18069e72\") " pod="openshift-network-operator/iptables-alerter-g9ftj"
Apr 16 10:04:37.899614 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.899431 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fc1a6ffe-e50d-41c1-aa12-db0a4d10232f-tmp-dir\") pod \"node-resolver-rj4q8\" (UID: \"fc1a6ffe-e50d-41c1-aa12-db0a4d10232f\") " pod="openshift-dns/node-resolver-rj4q8"
Apr 16 10:04:37.899614 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.899561 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8zlb\" (UniqueName: \"kubernetes.io/projected/54582370-009e-4add-bc9e-5a5c18069e72-kube-api-access-b8zlb\") pod \"iptables-alerter-g9ftj\" (UID: \"54582370-009e-4add-bc9e-5a5c18069e72\") " pod="openshift-network-operator/iptables-alerter-g9ftj"
Apr 16 10:04:37.899751 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.899662 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fc1a6ffe-e50d-41c1-aa12-db0a4d10232f-hosts-file\") pod \"node-resolver-rj4q8\" (UID: \"fc1a6ffe-e50d-41c1-aa12-db0a4d10232f\") " pod="openshift-dns/node-resolver-rj4q8"
Apr 16 10:04:37.899751 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.899705 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tq5gv\" (UniqueName: \"kubernetes.io/projected/fc1a6ffe-e50d-41c1-aa12-db0a4d10232f-kube-api-access-tq5gv\") pod \"node-resolver-rj4q8\" (UID: \"fc1a6ffe-e50d-41c1-aa12-db0a4d10232f\") " pod="openshift-dns/node-resolver-rj4q8"
Apr 16 10:04:37.899751 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.899729 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54582370-009e-4add-bc9e-5a5c18069e72-host-slash\") pod \"iptables-alerter-g9ftj\" (UID: \"54582370-009e-4add-bc9e-5a5c18069e72\") " pod="openshift-network-operator/iptables-alerter-g9ftj"
Apr 16 10:04:37.899862 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.899776 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fc1a6ffe-e50d-41c1-aa12-db0a4d10232f-tmp-dir\") pod \"node-resolver-rj4q8\" (UID: \"fc1a6ffe-e50d-41c1-aa12-db0a4d10232f\") " pod="openshift-dns/node-resolver-rj4q8"
Apr 16 10:04:37.899862 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.899801 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54582370-009e-4add-bc9e-5a5c18069e72-host-slash\") pod \"iptables-alerter-g9ftj\" (UID: \"54582370-009e-4add-bc9e-5a5c18069e72\") " pod="openshift-network-operator/iptables-alerter-g9ftj"
Apr 16 10:04:37.899862 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.899851 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fc1a6ffe-e50d-41c1-aa12-db0a4d10232f-hosts-file\") pod \"node-resolver-rj4q8\" (UID: \"fc1a6ffe-e50d-41c1-aa12-db0a4d10232f\") " pod="openshift-dns/node-resolver-rj4q8"
Apr 16 10:04:37.900071 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.900048 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/54582370-009e-4add-bc9e-5a5c18069e72-iptables-alerter-script\") pod \"iptables-alerter-g9ftj\" (UID: \"54582370-009e-4add-bc9e-5a5c18069e72\") " pod="openshift-network-operator/iptables-alerter-g9ftj"
Apr 16 10:04:37.908119 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.908098 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8zlb\" (UniqueName: \"kubernetes.io/projected/54582370-009e-4add-bc9e-5a5c18069e72-kube-api-access-b8zlb\") pod \"iptables-alerter-g9ftj\" (UID: \"54582370-009e-4add-bc9e-5a5c18069e72\") " pod="openshift-network-operator/iptables-alerter-g9ftj"
Apr 16 10:04:37.908254 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.908237 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq5gv\" (UniqueName: \"kubernetes.io/projected/fc1a6ffe-e50d-41c1-aa12-db0a4d10232f-kube-api-access-tq5gv\") pod \"node-resolver-rj4q8\" (UID: \"fc1a6ffe-e50d-41c1-aa12-db0a4d10232f\") " pod="openshift-dns/node-resolver-rj4q8"
Apr 16 10:04:37.985246 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.985165 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-s7wkr"
Apr 16 10:04:37.992577 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:37.992555 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-v9smg"
Apr 16 10:04:37.993136 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:37.993034 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee8a5c7c_8f19_49e3_8fed_4d2a3920d7cf.slice/crio-07c038055e4cbbe7ffb0486c668f20a44fec8bfe44dd64d8723723900e1b88b7 WatchSource:0}: Error finding container 07c038055e4cbbe7ffb0486c668f20a44fec8bfe44dd64d8723723900e1b88b7: Status 404 returned error can't find the container with id 07c038055e4cbbe7ffb0486c668f20a44fec8bfe44dd64d8723723900e1b88b7
Apr 16 10:04:38.001036 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.001012 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hhvjb"
Apr 16 10:04:38.001658 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:38.001632 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod191a958c_a1a7_4e33_9456_1f482a72fb5e.slice/crio-e1cefdf4a023377a9c4bec6638cf171d1a68917db3fe30892e5cf8721e91f097 WatchSource:0}: Error finding container e1cefdf4a023377a9c4bec6638cf171d1a68917db3fe30892e5cf8721e91f097: Status 404 returned error can't find the container with id e1cefdf4a023377a9c4bec6638cf171d1a68917db3fe30892e5cf8721e91f097
Apr 16 10:04:38.006951 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:38.006925 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22b85402_76c2_472c_90f0_25a54604bbb9.slice/crio-f76ff64cace3bc66429600e11497c0a03daab16e038782e26a817562389c61bf WatchSource:0}: Error finding container f76ff64cace3bc66429600e11497c0a03daab16e038782e26a817562389c61bf: Status 404 returned error can't find the container with id f76ff64cace3bc66429600e11497c0a03daab16e038782e26a817562389c61bf
Apr 16 10:04:38.007071 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.007055 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:04:38.013260 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:38.013222 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23649a52_557a_477c_838c_84b209078bbb.slice/crio-fcfc3311e599ac1b41f7caa2acdab9724a7efa8b60053cfa172f6d66c37bdcaf WatchSource:0}: Error finding container fcfc3311e599ac1b41f7caa2acdab9724a7efa8b60053cfa172f6d66c37bdcaf: Status 404 returned error can't find the container with id fcfc3311e599ac1b41f7caa2acdab9724a7efa8b60053cfa172f6d66c37bdcaf
Apr 16 10:04:38.014113 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.013868 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-v8sh7"
Apr 16 10:04:38.021134 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.021113 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8hjnc"
Apr 16 10:04:38.021412 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:38.021393 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a61fb09_0793_4b69_b34c_784caf0249e5.slice/crio-d3633f6559b175e2268ed7124a235eecc3763a9e72ba3e227bb0f54939ea162d WatchSource:0}: Error finding container d3633f6559b175e2268ed7124a235eecc3763a9e72ba3e227bb0f54939ea162d: Status 404 returned error can't find the container with id d3633f6559b175e2268ed7124a235eecc3763a9e72ba3e227bb0f54939ea162d
Apr 16 10:04:38.027109 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:38.027090 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fb1190b_eb18_4a5c_91c6_5bec59d57dc4.slice/crio-eea324fe78144676e61ae9dcb4db081033d0d9dc43e1abe90c3a53dc943a4f5e WatchSource:0}: Error finding container eea324fe78144676e61ae9dcb4db081033d0d9dc43e1abe90c3a53dc943a4f5e: Status 404 returned error can't find the container with id eea324fe78144676e61ae9dcb4db081033d0d9dc43e1abe90c3a53dc943a4f5e
Apr 16 10:04:38.027894 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.027876 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2"
Apr 16 10:04:38.034159 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:38.034137 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3597b6a_0a19_49d2_8f48_59d7320d0993.slice/crio-2cc667f96525bb4fab777718ddf1af62067dcd49c9997526c72171dc7cda1d88 WatchSource:0}: Error finding container 2cc667f96525bb4fab777718ddf1af62067dcd49c9997526c72171dc7cda1d88: Status 404 returned error can't find the container with id 2cc667f96525bb4fab777718ddf1af62067dcd49c9997526c72171dc7cda1d88
Apr 16 10:04:38.034873 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.034848 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rj4q8"
Apr 16 10:04:38.041940 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:38.041913 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc1a6ffe_e50d_41c1_aa12_db0a4d10232f.slice/crio-4a23d8d833c36d91afff3007597b8100f8aa27f03643a31a61b50e6d3edf59f9 WatchSource:0}: Error finding container 4a23d8d833c36d91afff3007597b8100f8aa27f03643a31a61b50e6d3edf59f9: Status 404 returned error can't find the container with id 4a23d8d833c36d91afff3007597b8100f8aa27f03643a31a61b50e6d3edf59f9
Apr 16 10:04:38.048227 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.048207 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-g9ftj"
Apr 16 10:04:38.056339 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:04:38.056313 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54582370_009e_4add_bc9e_5a5c18069e72.slice/crio-f2ef90ef6dcda07f394debeb60cd949903261bfefeea2bf7f3c094e9559e6cd1 WatchSource:0}: Error finding container f2ef90ef6dcda07f394debeb60cd949903261bfefeea2bf7f3c094e9559e6cd1: Status 404 returned error can't find the container with id f2ef90ef6dcda07f394debeb60cd949903261bfefeea2bf7f3c094e9559e6cd1
Apr 16 10:04:38.174111 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.173848 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-znlvq"]
Apr 16 10:04:38.177044 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.177013 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-znlvq"
Apr 16 10:04:38.177178 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:38.177111 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-znlvq" podUID="e9fc47e9-a934-4e31-b64d-70aef48ffd3e"
Apr 16 10:04:38.201122 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.201090 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-kubelet-config\") pod \"global-pull-secret-syncer-znlvq\" (UID: \"e9fc47e9-a934-4e31-b64d-70aef48ffd3e\") " pod="kube-system/global-pull-secret-syncer-znlvq"
Apr 16 10:04:38.201226 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.201130 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-dbus\") pod \"global-pull-secret-syncer-znlvq\" (UID: \"e9fc47e9-a934-4e31-b64d-70aef48ffd3e\") " pod="kube-system/global-pull-secret-syncer-znlvq"
Apr 16 10:04:38.201226 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.201202 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret\") pod \"global-pull-secret-syncer-znlvq\" (UID: \"e9fc47e9-a934-4e31-b64d-70aef48ffd3e\") " pod="kube-system/global-pull-secret-syncer-znlvq"
Apr 16 10:04:38.302563 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.302476 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-kubelet-config\") pod \"global-pull-secret-syncer-znlvq\" (UID: \"e9fc47e9-a934-4e31-b64d-70aef48ffd3e\") " pod="kube-system/global-pull-secret-syncer-znlvq"
Apr 16 10:04:38.302563 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.302511 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-dbus\") pod \"global-pull-secret-syncer-znlvq\" (UID: \"e9fc47e9-a934-4e31-b64d-70aef48ffd3e\") " pod="kube-system/global-pull-secret-syncer-znlvq"
Apr 16 10:04:38.302563 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.302551 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs\") pod \"network-metrics-daemon-9mqb2\" (UID: \"bccbed26-7fad-44dd-b120-4fe2758154e5\") " pod="openshift-multus/network-metrics-daemon-9mqb2"
Apr 16 10:04:38.302820 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.302585 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret\") pod \"global-pull-secret-syncer-znlvq\" (UID: \"e9fc47e9-a934-4e31-b64d-70aef48ffd3e\") " pod="kube-system/global-pull-secret-syncer-znlvq"
Apr 16 10:04:38.302820 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.302614 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-kubelet-config\") pod \"global-pull-secret-syncer-znlvq\" (UID: \"e9fc47e9-a934-4e31-b64d-70aef48ffd3e\") " pod="kube-system/global-pull-secret-syncer-znlvq"
Apr 16 10:04:38.302820 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:38.302687 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 10:04:38.302820 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.302699 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-dbus\") pod \"global-pull-secret-syncer-znlvq\" (UID: \"e9fc47e9-a934-4e31-b64d-70aef48ffd3e\") " pod="kube-system/global-pull-secret-syncer-znlvq"
Apr 16 10:04:38.302820 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:38.302743 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret podName:e9fc47e9-a934-4e31-b64d-70aef48ffd3e nodeName:}" failed. No retries permitted until 2026-04-16 10:04:38.802725694 +0000 UTC m=+3.583134860 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret") pod "global-pull-secret-syncer-znlvq" (UID: "e9fc47e9-a934-4e31-b64d-70aef48ffd3e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 10:04:38.302820 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:38.302744 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 10:04:38.302820 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:38.302802 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs podName:bccbed26-7fad-44dd-b120-4fe2758154e5 nodeName:}" failed. No retries permitted until 2026-04-16 10:04:39.30278476 +0000 UTC m=+4.083193926 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs") pod "network-metrics-daemon-9mqb2" (UID: "bccbed26-7fad-44dd-b120-4fe2758154e5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 10:04:38.403428 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.403397 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhjtp\" (UniqueName: \"kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp\") pod \"network-check-target-2h7wk\" (UID: \"dba5a2c9-dd08-4be6-a76d-85742ada944e\") " pod="openshift-network-diagnostics/network-check-target-2h7wk"
Apr 16 10:04:38.403585 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:38.403551 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 10:04:38.403585 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:38.403567 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 10:04:38.403585 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:38.403576 2570 projected.go:194] Error preparing data for projected volume kube-api-access-mhjtp for pod openshift-network-diagnostics/network-check-target-2h7wk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 10:04:38.403687 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:38.403624 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp podName:dba5a2c9-dd08-4be6-a76d-85742ada944e nodeName:}" failed.
No retries permitted until 2026-04-16 10:04:39.403610084 +0000 UTC m=+4.184019260 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-mhjtp" (UniqueName: "kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp") pod "network-check-target-2h7wk" (UID: "dba5a2c9-dd08-4be6-a76d-85742ada944e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:38.736879 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.736793 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 09:59:36 +0000 UTC" deadline="2027-10-28 05:38:19.728691635 +0000 UTC" Apr 16 10:04:38.736879 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.736831 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13435h33m40.991864663s" Apr 16 10:04:38.794615 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.794582 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h7wk" Apr 16 10:04:38.794759 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:38.794712 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2h7wk" podUID="dba5a2c9-dd08-4be6-a76d-85742ada944e" Apr 16 10:04:38.806975 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.806461 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret\") pod \"global-pull-secret-syncer-znlvq\" (UID: \"e9fc47e9-a934-4e31-b64d-70aef48ffd3e\") " pod="kube-system/global-pull-secret-syncer-znlvq" Apr 16 10:04:38.806975 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:38.806616 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 10:04:38.806975 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:38.806671 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret podName:e9fc47e9-a934-4e31-b64d-70aef48ffd3e nodeName:}" failed. No retries permitted until 2026-04-16 10:04:39.806654135 +0000 UTC m=+4.587063315 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret") pod "global-pull-secret-syncer-znlvq" (UID: "e9fc47e9-a934-4e31-b64d-70aef48ffd3e") : object "kube-system"/"original-pull-secret" not registered Apr 16 10:04:38.817608 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.817511 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2" event={"ID":"e3597b6a-0a19-49d2-8f48-59d7320d0993","Type":"ContainerStarted","Data":"2cc667f96525bb4fab777718ddf1af62067dcd49c9997526c72171dc7cda1d88"} Apr 16 10:04:38.823271 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.823245 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8hjnc" event={"ID":"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4","Type":"ContainerStarted","Data":"eea324fe78144676e61ae9dcb4db081033d0d9dc43e1abe90c3a53dc943a4f5e"} Apr 16 10:04:38.832862 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.832806 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-v8sh7" event={"ID":"4a61fb09-0793-4b69-b34c-784caf0249e5","Type":"ContainerStarted","Data":"d3633f6559b175e2268ed7124a235eecc3763a9e72ba3e227bb0f54939ea162d"} Apr 16 10:04:38.843042 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.843017 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" event={"ID":"23649a52-557a-477c-838c-84b209078bbb","Type":"ContainerStarted","Data":"fcfc3311e599ac1b41f7caa2acdab9724a7efa8b60053cfa172f6d66c37bdcaf"} Apr 16 10:04:38.847022 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.846988 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-g9ftj" 
event={"ID":"54582370-009e-4add-bc9e-5a5c18069e72","Type":"ContainerStarted","Data":"f2ef90ef6dcda07f394debeb60cd949903261bfefeea2bf7f3c094e9559e6cd1"} Apr 16 10:04:38.857173 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.857147 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rj4q8" event={"ID":"fc1a6ffe-e50d-41c1-aa12-db0a4d10232f","Type":"ContainerStarted","Data":"4a23d8d833c36d91afff3007597b8100f8aa27f03643a31a61b50e6d3edf59f9"} Apr 16 10:04:38.864219 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.864191 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hhvjb" event={"ID":"22b85402-76c2-472c-90f0-25a54604bbb9","Type":"ContainerStarted","Data":"f76ff64cace3bc66429600e11497c0a03daab16e038782e26a817562389c61bf"} Apr 16 10:04:38.866509 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.866484 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9smg" event={"ID":"191a958c-a1a7-4e33-9456-1f482a72fb5e","Type":"ContainerStarted","Data":"e1cefdf4a023377a9c4bec6638cf171d1a68917db3fe30892e5cf8721e91f097"} Apr 16 10:04:38.872092 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:38.872069 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-s7wkr" event={"ID":"ee8a5c7c-8f19-49e3-8fed-4d2a3920d7cf","Type":"ContainerStarted","Data":"07c038055e4cbbe7ffb0486c668f20a44fec8bfe44dd64d8723723900e1b88b7"} Apr 16 10:04:39.310755 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:39.310718 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs\") pod \"network-metrics-daemon-9mqb2\" (UID: \"bccbed26-7fad-44dd-b120-4fe2758154e5\") " pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:04:39.310927 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:39.310887 2570 secret.go:189] 
Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:39.310983 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:39.310965 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs podName:bccbed26-7fad-44dd-b120-4fe2758154e5 nodeName:}" failed. No retries permitted until 2026-04-16 10:04:41.310944652 +0000 UTC m=+6.091353834 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs") pod "network-metrics-daemon-9mqb2" (UID: "bccbed26-7fad-44dd-b120-4fe2758154e5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:39.412098 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:39.412055 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhjtp\" (UniqueName: \"kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp\") pod \"network-check-target-2h7wk\" (UID: \"dba5a2c9-dd08-4be6-a76d-85742ada944e\") " pod="openshift-network-diagnostics/network-check-target-2h7wk" Apr 16 10:04:39.412267 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:39.412242 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 10:04:39.412267 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:39.412261 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 10:04:39.412367 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:39.412275 2570 projected.go:194] Error preparing data for projected volume kube-api-access-mhjtp for pod openshift-network-diagnostics/network-check-target-2h7wk: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:39.412367 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:39.412334 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp podName:dba5a2c9-dd08-4be6-a76d-85742ada944e nodeName:}" failed. No retries permitted until 2026-04-16 10:04:41.412315311 +0000 UTC m=+6.192724493 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-mhjtp" (UniqueName: "kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp") pod "network-check-target-2h7wk" (UID: "dba5a2c9-dd08-4be6-a76d-85742ada944e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:39.797634 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:39.796921 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:04:39.797634 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:39.797050 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9mqb2" podUID="bccbed26-7fad-44dd-b120-4fe2758154e5" Apr 16 10:04:39.797634 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:39.797426 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-znlvq" Apr 16 10:04:39.799452 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:39.797514 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-znlvq" podUID="e9fc47e9-a934-4e31-b64d-70aef48ffd3e" Apr 16 10:04:39.815158 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:39.815130 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret\") pod \"global-pull-secret-syncer-znlvq\" (UID: \"e9fc47e9-a934-4e31-b64d-70aef48ffd3e\") " pod="kube-system/global-pull-secret-syncer-znlvq" Apr 16 10:04:39.815307 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:39.815291 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 10:04:39.815377 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:39.815352 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret podName:e9fc47e9-a934-4e31-b64d-70aef48ffd3e nodeName:}" failed. No retries permitted until 2026-04-16 10:04:41.815333102 +0000 UTC m=+6.595742267 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret") pod "global-pull-secret-syncer-znlvq" (UID: "e9fc47e9-a934-4e31-b64d-70aef48ffd3e") : object "kube-system"/"original-pull-secret" not registered Apr 16 10:04:40.795599 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:40.794862 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h7wk" Apr 16 10:04:40.795599 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:40.795220 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h7wk" podUID="dba5a2c9-dd08-4be6-a76d-85742ada944e" Apr 16 10:04:40.884352 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:40.884268 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal" event={"ID":"df50e37f169a5ef9261defc9686381a1","Type":"ContainerStarted","Data":"37be15147448bd6fd51779aee77e022a9d26b31d6f1ff7558f6a0abfed748173"} Apr 16 10:04:41.327412 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:41.327372 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs\") pod \"network-metrics-daemon-9mqb2\" (UID: \"bccbed26-7fad-44dd-b120-4fe2758154e5\") " pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:04:41.327597 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:41.327569 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:41.327653 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:41.327634 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs podName:bccbed26-7fad-44dd-b120-4fe2758154e5 nodeName:}" failed. No retries permitted until 2026-04-16 10:04:45.327614368 +0000 UTC m=+10.108023655 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs") pod "network-metrics-daemon-9mqb2" (UID: "bccbed26-7fad-44dd-b120-4fe2758154e5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:41.428661 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:41.428625 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhjtp\" (UniqueName: \"kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp\") pod \"network-check-target-2h7wk\" (UID: \"dba5a2c9-dd08-4be6-a76d-85742ada944e\") " pod="openshift-network-diagnostics/network-check-target-2h7wk" Apr 16 10:04:41.428851 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:41.428780 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 10:04:41.428851 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:41.428797 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 10:04:41.428851 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:41.428810 2570 projected.go:194] Error preparing data for projected volume kube-api-access-mhjtp for pod openshift-network-diagnostics/network-check-target-2h7wk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:41.429011 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:41.428869 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp podName:dba5a2c9-dd08-4be6-a76d-85742ada944e nodeName:}" failed. No retries permitted until 2026-04-16 10:04:45.428850966 +0000 UTC m=+10.209260132 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-mhjtp" (UniqueName: "kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp") pod "network-check-target-2h7wk" (UID: "dba5a2c9-dd08-4be6-a76d-85742ada944e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:41.795429 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:41.794791 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:04:41.795429 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:41.794932 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9mqb2" podUID="bccbed26-7fad-44dd-b120-4fe2758154e5" Apr 16 10:04:41.795429 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:41.795333 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-znlvq" Apr 16 10:04:41.795854 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:41.795809 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-znlvq" podUID="e9fc47e9-a934-4e31-b64d-70aef48ffd3e" Apr 16 10:04:41.832253 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:41.832198 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret\") pod \"global-pull-secret-syncer-znlvq\" (UID: \"e9fc47e9-a934-4e31-b64d-70aef48ffd3e\") " pod="kube-system/global-pull-secret-syncer-znlvq" Apr 16 10:04:41.832423 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:41.832334 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 10:04:41.832423 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:41.832396 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret podName:e9fc47e9-a934-4e31-b64d-70aef48ffd3e nodeName:}" failed. No retries permitted until 2026-04-16 10:04:45.832377877 +0000 UTC m=+10.612787053 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret") pod "global-pull-secret-syncer-znlvq" (UID: "e9fc47e9-a934-4e31-b64d-70aef48ffd3e") : object "kube-system"/"original-pull-secret" not registered Apr 16 10:04:42.794548 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:42.794491 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h7wk" Apr 16 10:04:42.795051 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:42.794641 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h7wk" podUID="dba5a2c9-dd08-4be6-a76d-85742ada944e" Apr 16 10:04:43.794542 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:43.794499 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:04:43.794711 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:43.794653 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9mqb2" podUID="bccbed26-7fad-44dd-b120-4fe2758154e5" Apr 16 10:04:43.794711 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:43.794671 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-znlvq" Apr 16 10:04:43.795111 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:43.794743 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-znlvq" podUID="e9fc47e9-a934-4e31-b64d-70aef48ffd3e" Apr 16 10:04:43.892300 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:43.892261 2570 generic.go:358] "Generic (PLEG): container finished" podID="df50e37f169a5ef9261defc9686381a1" containerID="37be15147448bd6fd51779aee77e022a9d26b31d6f1ff7558f6a0abfed748173" exitCode=0 Apr 16 10:04:43.892452 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:43.892313 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal" event={"ID":"df50e37f169a5ef9261defc9686381a1","Type":"ContainerDied","Data":"37be15147448bd6fd51779aee77e022a9d26b31d6f1ff7558f6a0abfed748173"} Apr 16 10:04:44.795105 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:44.795067 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h7wk" Apr 16 10:04:44.795576 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:44.795205 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2h7wk" podUID="dba5a2c9-dd08-4be6-a76d-85742ada944e" Apr 16 10:04:45.360333 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:45.360272 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs\") pod \"network-metrics-daemon-9mqb2\" (UID: \"bccbed26-7fad-44dd-b120-4fe2758154e5\") " pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:04:45.360518 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:45.360442 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:45.360609 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:45.360568 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs podName:bccbed26-7fad-44dd-b120-4fe2758154e5 nodeName:}" failed. No retries permitted until 2026-04-16 10:04:53.360548177 +0000 UTC m=+18.140957358 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs") pod "network-metrics-daemon-9mqb2" (UID: "bccbed26-7fad-44dd-b120-4fe2758154e5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:45.461327 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:45.461283 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhjtp\" (UniqueName: \"kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp\") pod \"network-check-target-2h7wk\" (UID: \"dba5a2c9-dd08-4be6-a76d-85742ada944e\") " pod="openshift-network-diagnostics/network-check-target-2h7wk" Apr 16 10:04:45.461511 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:45.461456 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 10:04:45.461511 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:45.461481 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 10:04:45.461511 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:45.461495 2570 projected.go:194] Error preparing data for projected volume kube-api-access-mhjtp for pod openshift-network-diagnostics/network-check-target-2h7wk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:45.461688 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:45.461573 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp podName:dba5a2c9-dd08-4be6-a76d-85742ada944e nodeName:}" failed. 
No retries permitted until 2026-04-16 10:04:53.461553178 +0000 UTC m=+18.241962347 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-mhjtp" (UniqueName: "kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp") pod "network-check-target-2h7wk" (UID: "dba5a2c9-dd08-4be6-a76d-85742ada944e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:45.795033 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:45.795000 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-znlvq" Apr 16 10:04:45.795196 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:45.795048 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:04:45.795196 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:45.795135 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-znlvq" podUID="e9fc47e9-a934-4e31-b64d-70aef48ffd3e" Apr 16 10:04:45.795626 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:45.795238 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9mqb2" podUID="bccbed26-7fad-44dd-b120-4fe2758154e5" Apr 16 10:04:45.864786 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:45.864520 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret\") pod \"global-pull-secret-syncer-znlvq\" (UID: \"e9fc47e9-a934-4e31-b64d-70aef48ffd3e\") " pod="kube-system/global-pull-secret-syncer-znlvq" Apr 16 10:04:45.864786 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:45.864657 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 10:04:45.864786 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:45.864741 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret podName:e9fc47e9-a934-4e31-b64d-70aef48ffd3e nodeName:}" failed. No retries permitted until 2026-04-16 10:04:53.864721046 +0000 UTC m=+18.645130238 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret") pod "global-pull-secret-syncer-znlvq" (UID: "e9fc47e9-a934-4e31-b64d-70aef48ffd3e") : object "kube-system"/"original-pull-secret" not registered Apr 16 10:04:46.794775 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:46.794741 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h7wk" Apr 16 10:04:46.794946 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:46.794859 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h7wk" podUID="dba5a2c9-dd08-4be6-a76d-85742ada944e" Apr 16 10:04:47.794390 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:47.794359 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:04:47.794766 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:47.794467 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9mqb2" podUID="bccbed26-7fad-44dd-b120-4fe2758154e5" Apr 16 10:04:47.794766 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:47.794538 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-znlvq" Apr 16 10:04:47.794766 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:47.794624 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-znlvq" podUID="e9fc47e9-a934-4e31-b64d-70aef48ffd3e" Apr 16 10:04:48.794780 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:48.794745 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h7wk" Apr 16 10:04:48.795201 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:48.794852 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h7wk" podUID="dba5a2c9-dd08-4be6-a76d-85742ada944e" Apr 16 10:04:49.794587 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:49.794546 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:04:49.794763 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:49.794683 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9mqb2" podUID="bccbed26-7fad-44dd-b120-4fe2758154e5" Apr 16 10:04:49.794763 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:49.794744 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-znlvq" Apr 16 10:04:49.795161 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:49.794854 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-znlvq" podUID="e9fc47e9-a934-4e31-b64d-70aef48ffd3e" Apr 16 10:04:50.795088 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:50.795057 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h7wk" Apr 16 10:04:50.795483 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:50.795174 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h7wk" podUID="dba5a2c9-dd08-4be6-a76d-85742ada944e" Apr 16 10:04:51.795015 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:51.794981 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-znlvq" Apr 16 10:04:51.795234 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:51.794990 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:04:51.795234 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:51.795092 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-znlvq" podUID="e9fc47e9-a934-4e31-b64d-70aef48ffd3e" Apr 16 10:04:51.795234 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:51.795177 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9mqb2" podUID="bccbed26-7fad-44dd-b120-4fe2758154e5" Apr 16 10:04:52.794776 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:52.794744 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h7wk" Apr 16 10:04:52.794993 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:52.794844 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2h7wk" podUID="dba5a2c9-dd08-4be6-a76d-85742ada944e" Apr 16 10:04:53.418923 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:53.418885 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs\") pod \"network-metrics-daemon-9mqb2\" (UID: \"bccbed26-7fad-44dd-b120-4fe2758154e5\") " pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:04:53.419316 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:53.419020 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:53.419316 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:53.419118 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs podName:bccbed26-7fad-44dd-b120-4fe2758154e5 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:09.419097559 +0000 UTC m=+34.199506729 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs") pod "network-metrics-daemon-9mqb2" (UID: "bccbed26-7fad-44dd-b120-4fe2758154e5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 10:04:53.519413 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:53.519380 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhjtp\" (UniqueName: \"kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp\") pod \"network-check-target-2h7wk\" (UID: \"dba5a2c9-dd08-4be6-a76d-85742ada944e\") " pod="openshift-network-diagnostics/network-check-target-2h7wk" Apr 16 10:04:53.519585 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:53.519564 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 10:04:53.519585 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:53.519584 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 10:04:53.519687 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:53.519601 2570 projected.go:194] Error preparing data for projected volume kube-api-access-mhjtp for pod openshift-network-diagnostics/network-check-target-2h7wk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:53.519687 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:53.519661 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp podName:dba5a2c9-dd08-4be6-a76d-85742ada944e nodeName:}" failed. 
No retries permitted until 2026-04-16 10:05:09.519647592 +0000 UTC m=+34.300056761 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-mhjtp" (UniqueName: "kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp") pod "network-check-target-2h7wk" (UID: "dba5a2c9-dd08-4be6-a76d-85742ada944e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 10:04:53.794869 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:53.794825 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-znlvq" Apr 16 10:04:53.795031 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:53.794836 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:04:53.795031 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:53.794960 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-znlvq" podUID="e9fc47e9-a934-4e31-b64d-70aef48ffd3e" Apr 16 10:04:53.795140 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:53.795117 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9mqb2" podUID="bccbed26-7fad-44dd-b120-4fe2758154e5" Apr 16 10:04:53.922799 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:53.922761 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret\") pod \"global-pull-secret-syncer-znlvq\" (UID: \"e9fc47e9-a934-4e31-b64d-70aef48ffd3e\") " pod="kube-system/global-pull-secret-syncer-znlvq" Apr 16 10:04:53.922974 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:53.922893 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 10:04:53.922974 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:53.922952 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret podName:e9fc47e9-a934-4e31-b64d-70aef48ffd3e nodeName:}" failed. No retries permitted until 2026-04-16 10:05:09.922939019 +0000 UTC m=+34.703348183 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret") pod "global-pull-secret-syncer-znlvq" (UID: "e9fc47e9-a934-4e31-b64d-70aef48ffd3e") : object "kube-system"/"original-pull-secret" not registered Apr 16 10:04:54.794729 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:54.794698 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h7wk" Apr 16 10:04:54.795120 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:54.794818 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h7wk" podUID="dba5a2c9-dd08-4be6-a76d-85742ada944e" Apr 16 10:04:55.795239 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:55.795205 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-znlvq" Apr 16 10:04:55.795763 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:55.795286 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-znlvq" podUID="e9fc47e9-a934-4e31-b64d-70aef48ffd3e" Apr 16 10:04:55.795763 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:55.795354 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:04:55.795763 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:55.795436 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9mqb2" podUID="bccbed26-7fad-44dd-b120-4fe2758154e5" Apr 16 10:04:56.795095 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.794771 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h7wk" Apr 16 10:04:56.795307 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:56.795194 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h7wk" podUID="dba5a2c9-dd08-4be6-a76d-85742ada944e" Apr 16 10:04:56.923001 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.922917 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hhvjb" event={"ID":"22b85402-76c2-472c-90f0-25a54604bbb9","Type":"ContainerStarted","Data":"492b730fcb4ed1e0c35d8ae08f6f5045a0cf06b6df35886c6b379dfd35c9a501"} Apr 16 10:04:56.924232 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.924199 2570 generic.go:358] "Generic (PLEG): container finished" podID="191a958c-a1a7-4e33-9456-1f482a72fb5e" containerID="939c54f883d252bd97becd8f3d3c645dedb274046424d20c8c6c02e40ce1ead5" exitCode=0 Apr 16 10:04:56.924232 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.924228 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9smg" event={"ID":"191a958c-a1a7-4e33-9456-1f482a72fb5e","Type":"ContainerDied","Data":"939c54f883d252bd97becd8f3d3c645dedb274046424d20c8c6c02e40ce1ead5"} Apr 16 10:04:56.925648 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.925624 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-s7wkr" 
event={"ID":"ee8a5c7c-8f19-49e3-8fed-4d2a3920d7cf","Type":"ContainerStarted","Data":"9fd7650252bb0fb7821fbb42764292a5db325a7dd57e359bcdb8184bc5c4f35f"} Apr 16 10:04:56.927482 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.927457 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal" event={"ID":"df50e37f169a5ef9261defc9686381a1","Type":"ContainerStarted","Data":"a6b885ccabe9e3b5770a7e405e0826ff531d76e9a4463aabe3a290e2aad9fb41"} Apr 16 10:04:56.928634 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.928613 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-196.ec2.internal" event={"ID":"9c721cdecb4375340a2fbe75779b609c","Type":"ContainerStarted","Data":"b27447601410544612e0e8226d7bb2a1944a3cad07f93479faebcdb2d60b05b5"} Apr 16 10:04:56.929815 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.929797 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2" event={"ID":"e3597b6a-0a19-49d2-8f48-59d7320d0993","Type":"ContainerStarted","Data":"5eab623858c05cc4a44469c3cb73c0bc38526c20667147ffe3ed317162bdf5c5"} Apr 16 10:04:56.930933 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.930915 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8hjnc" event={"ID":"9fb1190b-eb18-4a5c-91c6-5bec59d57dc4","Type":"ContainerStarted","Data":"99939ad14a392877ca3b6762552813bc259e9a994a8c102658cebd593e57951e"} Apr 16 10:04:56.932043 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.932024 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-v8sh7" event={"ID":"4a61fb09-0793-4b69-b34c-784caf0249e5","Type":"ContainerStarted","Data":"b622969a465593ac88cbd038d5ba8d84f1cf9fe78a79c2944babefb83cbe0d50"} Apr 16 10:04:56.934346 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.934331 
2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 10:04:56.934624 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.934608 2570 generic.go:358] "Generic (PLEG): container finished" podID="23649a52-557a-477c-838c-84b209078bbb" containerID="d69bc80078fb4ce0b0fe6de56b1fda86add3a7887e3e55bfe37f0773f3221c59" exitCode=1 Apr 16 10:04:56.934677 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.934658 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" event={"ID":"23649a52-557a-477c-838c-84b209078bbb","Type":"ContainerStarted","Data":"82d3a3d9e579ed6671c836e1a2f622a8567ade3cd7fe6a1c6e2ac0818fbfb1bf"} Apr 16 10:04:56.934677 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.934673 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" event={"ID":"23649a52-557a-477c-838c-84b209078bbb","Type":"ContainerStarted","Data":"c18af189b5b34a64d3114e782ca153c15cc26aeafd4d8fa8e6fe17d5414a272c"} Apr 16 10:04:56.934746 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.934682 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" event={"ID":"23649a52-557a-477c-838c-84b209078bbb","Type":"ContainerStarted","Data":"6499675b8094373683b9d7805856edcabc711244f4216afaa23737c1aed39f5a"} Apr 16 10:04:56.934746 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.934690 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" event={"ID":"23649a52-557a-477c-838c-84b209078bbb","Type":"ContainerStarted","Data":"d0d3e3a7f8a89879dd40352283b856ff14dc89cfc15eb6b076f260a04c2ed3e6"} Apr 16 10:04:56.934746 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.934698 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" 
event={"ID":"23649a52-557a-477c-838c-84b209078bbb","Type":"ContainerDied","Data":"d69bc80078fb4ce0b0fe6de56b1fda86add3a7887e3e55bfe37f0773f3221c59"} Apr 16 10:04:56.934746 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.934708 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" event={"ID":"23649a52-557a-477c-838c-84b209078bbb","Type":"ContainerStarted","Data":"59daf2628c7ac4daa7f0678ebf522819a6dd6bc4dff6d4ce374dc69b77417900"} Apr 16 10:04:56.935676 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.935640 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hhvjb" podStartSLOduration=3.915867623 podStartE2EDuration="21.935627602s" podCreationTimestamp="2026-04-16 10:04:35 +0000 UTC" firstStartedPulling="2026-04-16 10:04:38.008628519 +0000 UTC m=+2.789037687" lastFinishedPulling="2026-04-16 10:04:56.028388487 +0000 UTC m=+20.808797666" observedRunningTime="2026-04-16 10:04:56.935376842 +0000 UTC m=+21.715786053" watchObservedRunningTime="2026-04-16 10:04:56.935627602 +0000 UTC m=+21.716036790" Apr 16 10:04:56.935794 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.935776 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rj4q8" event={"ID":"fc1a6ffe-e50d-41c1-aa12-db0a4d10232f","Type":"ContainerStarted","Data":"a94c24a8dc37f898ae38c65da9bef5a9fa0a3f09835135f2f6a7ee0fd2a3554e"} Apr 16 10:04:56.947280 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.947242 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-v8sh7" podStartSLOduration=2.9923867790000003 podStartE2EDuration="20.947232513s" podCreationTimestamp="2026-04-16 10:04:36 +0000 UTC" firstStartedPulling="2026-04-16 10:04:38.0241396 +0000 UTC m=+2.804548765" lastFinishedPulling="2026-04-16 10:04:55.978985331 +0000 UTC m=+20.759394499" observedRunningTime="2026-04-16 10:04:56.947064768 +0000 UTC 
m=+21.727473955" watchObservedRunningTime="2026-04-16 10:04:56.947232513 +0000 UTC m=+21.727641700" Apr 16 10:04:56.957547 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.957503 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-196.ec2.internal" podStartSLOduration=19.957493753 podStartE2EDuration="19.957493753s" podCreationTimestamp="2026-04-16 10:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 10:04:56.957099022 +0000 UTC m=+21.737508209" watchObservedRunningTime="2026-04-16 10:04:56.957493753 +0000 UTC m=+21.737902939" Apr 16 10:04:56.988029 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:56.987977 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-s7wkr" podStartSLOduration=4.009349545 podStartE2EDuration="21.987964464s" podCreationTimestamp="2026-04-16 10:04:35 +0000 UTC" firstStartedPulling="2026-04-16 10:04:37.997450705 +0000 UTC m=+2.777859869" lastFinishedPulling="2026-04-16 10:04:55.976065608 +0000 UTC m=+20.756474788" observedRunningTime="2026-04-16 10:04:56.969086216 +0000 UTC m=+21.749495403" watchObservedRunningTime="2026-04-16 10:04:56.987964464 +0000 UTC m=+21.768373651" Apr 16 10:04:57.001976 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:57.001916 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8hjnc" podStartSLOduration=3.042384921 podStartE2EDuration="21.001894807s" podCreationTimestamp="2026-04-16 10:04:36 +0000 UTC" firstStartedPulling="2026-04-16 10:04:38.029046506 +0000 UTC m=+2.809455676" lastFinishedPulling="2026-04-16 10:04:55.988556398 +0000 UTC m=+20.768965562" observedRunningTime="2026-04-16 10:04:57.00107729 +0000 UTC m=+21.781486480" watchObservedRunningTime="2026-04-16 10:04:57.001894807 +0000 UTC m=+21.782303997" Apr 16 
10:04:57.012806 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:57.012771 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-196.ec2.internal" podStartSLOduration=20.012760534 podStartE2EDuration="20.012760534s" podCreationTimestamp="2026-04-16 10:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 10:04:57.012213079 +0000 UTC m=+21.792622286" watchObservedRunningTime="2026-04-16 10:04:57.012760534 +0000 UTC m=+21.793169721" Apr 16 10:04:57.038265 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:57.038214 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rj4q8" podStartSLOduration=3.105788759 podStartE2EDuration="21.038199506s" podCreationTimestamp="2026-04-16 10:04:36 +0000 UTC" firstStartedPulling="2026-04-16 10:04:38.043650647 +0000 UTC m=+2.824059813" lastFinishedPulling="2026-04-16 10:04:55.976061394 +0000 UTC m=+20.756470560" observedRunningTime="2026-04-16 10:04:57.037979559 +0000 UTC m=+21.818388765" watchObservedRunningTime="2026-04-16 10:04:57.038199506 +0000 UTC m=+21.818608693" Apr 16 10:04:57.281176 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:57.281155 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 10:04:57.757966 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:57.757872 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T10:04:57.281172286Z","UUID":"dcacc849-810f-496f-b17d-7408602fd244","Handler":null,"Name":"","Endpoint":""} Apr 16 10:04:57.760158 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:57.760051 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to 
validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 10:04:57.760158 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:57.760080 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 10:04:57.794933 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:57.794907 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-znlvq" Apr 16 10:04:57.795076 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:57.794950 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:04:57.795076 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:57.795053 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-znlvq" podUID="e9fc47e9-a934-4e31-b64d-70aef48ffd3e" Apr 16 10:04:57.795232 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:57.795199 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9mqb2" podUID="bccbed26-7fad-44dd-b120-4fe2758154e5"
Apr 16 10:04:57.940176 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:57.940138 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2" event={"ID":"e3597b6a-0a19-49d2-8f48-59d7320d0993","Type":"ContainerStarted","Data":"ff30cf280f70687e9640c01eb8281617e4e74ee3b6e0a36ac0eca7b204ec6bc4"}
Apr 16 10:04:57.941976 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:57.941946 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-g9ftj" event={"ID":"54582370-009e-4add-bc9e-5a5c18069e72","Type":"ContainerStarted","Data":"4b3e28a9f4af598055977bfb8bbd0ff48530b33742874bc297b38a824a808e58"}
Apr 16 10:04:57.964760 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:57.964718 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-g9ftj" podStartSLOduration=4.039736287 podStartE2EDuration="21.964700299s" podCreationTimestamp="2026-04-16 10:04:36 +0000 UTC" firstStartedPulling="2026-04-16 10:04:38.058073153 +0000 UTC m=+2.838482321" lastFinishedPulling="2026-04-16 10:04:55.983037153 +0000 UTC m=+20.763446333" observedRunningTime="2026-04-16 10:04:57.964556594 +0000 UTC m=+22.744965783" watchObservedRunningTime="2026-04-16 10:04:57.964700299 +0000 UTC m=+22.745109487"
Apr 16 10:04:58.794473 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:58.794451 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h7wk"
Apr 16 10:04:58.794630 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:58.794578 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h7wk" podUID="dba5a2c9-dd08-4be6-a76d-85742ada944e"
Apr 16 10:04:58.945308 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:58.945226 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2" event={"ID":"e3597b6a-0a19-49d2-8f48-59d7320d0993","Type":"ContainerStarted","Data":"bc6fe2af6e7e1577155984e222582b7816792e35d61d4961fe41698afcc19f26"}
Apr 16 10:04:58.948338 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:58.948314 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log"
Apr 16 10:04:58.948804 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:58.948755 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" event={"ID":"23649a52-557a-477c-838c-84b209078bbb","Type":"ContainerStarted","Data":"9c60e2ecbf80f816079186016a16682c63c703dbae1683f10cfba0d81397bf19"}
Apr 16 10:04:58.962460 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:58.962418 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-prdq2" podStartSLOduration=2.439564353 podStartE2EDuration="22.96240497s" podCreationTimestamp="2026-04-16 10:04:36 +0000 UTC" firstStartedPulling="2026-04-16 10:04:38.036264657 +0000 UTC m=+2.816673826" lastFinishedPulling="2026-04-16 10:04:58.559105277 +0000 UTC m=+23.339514443" observedRunningTime="2026-04-16 10:04:58.961946288 +0000 UTC m=+23.742355474" watchObservedRunningTime="2026-04-16 10:04:58.96240497 +0000 UTC m=+23.742814156"
Apr 16 10:04:59.794775 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:59.794741 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-znlvq"
Apr 16 10:04:59.794775 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:04:59.794775 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9mqb2"
Apr 16 10:04:59.795002 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:59.794868 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-znlvq" podUID="e9fc47e9-a934-4e31-b64d-70aef48ffd3e"
Apr 16 10:04:59.795061 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:04:59.795030 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9mqb2" podUID="bccbed26-7fad-44dd-b120-4fe2758154e5"
Apr 16 10:05:00.036414 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:00.036365 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-s7wkr"
Apr 16 10:05:00.795333 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:00.795064 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h7wk"
Apr 16 10:05:00.795495 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:00.795380 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h7wk" podUID="dba5a2c9-dd08-4be6-a76d-85742ada944e"
Apr 16 10:05:01.099082 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:01.098699 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-s7wkr"
Apr 16 10:05:01.100865 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:01.099240 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-s7wkr"
Apr 16 10:05:01.794080 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:01.794055 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-znlvq"
Apr 16 10:05:01.794080 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:01.794076 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9mqb2"
Apr 16 10:05:01.794325 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:01.794166 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-znlvq" podUID="e9fc47e9-a934-4e31-b64d-70aef48ffd3e"
Apr 16 10:05:01.794325 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:01.794300 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9mqb2" podUID="bccbed26-7fad-44dd-b120-4fe2758154e5"
Apr 16 10:05:01.955344 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:01.955160 2570 generic.go:358] "Generic (PLEG): container finished" podID="191a958c-a1a7-4e33-9456-1f482a72fb5e" containerID="004fee5fe1bf8e0ef81cf841147ad1c6fe379e9edeee9ea216501cdc287173cd" exitCode=0
Apr 16 10:05:01.955493 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:01.955208 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9smg" event={"ID":"191a958c-a1a7-4e33-9456-1f482a72fb5e","Type":"ContainerDied","Data":"004fee5fe1bf8e0ef81cf841147ad1c6fe379e9edeee9ea216501cdc287173cd"}
Apr 16 10:05:01.958314 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:01.958297 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log"
Apr 16 10:05:01.958629 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:01.958604 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" event={"ID":"23649a52-557a-477c-838c-84b209078bbb","Type":"ContainerStarted","Data":"23b5ffadc50fb84ed4ad39ecdae3f1a22ce6f83e17c9826c01b45d5f1f57e0a2"}
Apr 16 10:05:01.959046 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:01.959030 2570 scope.go:117] "RemoveContainer" containerID="d69bc80078fb4ce0b0fe6de56b1fda86add3a7887e3e55bfe37f0773f3221c59"
Apr 16 10:05:01.959208 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:01.959192 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-s7wkr"
Apr 16 10:05:02.794616 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:02.794573 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h7wk"
Apr 16 10:05:02.795143 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:02.794710 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h7wk" podUID="dba5a2c9-dd08-4be6-a76d-85742ada944e"
Apr 16 10:05:02.948880 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:02.948852 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-znlvq"]
Apr 16 10:05:02.949012 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:02.948990 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-znlvq"
Apr 16 10:05:02.949124 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:02.949102 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-znlvq" podUID="e9fc47e9-a934-4e31-b64d-70aef48ffd3e"
Apr 16 10:05:02.952264 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:02.952241 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9mqb2"]
Apr 16 10:05:02.952387 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:02.952347 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9mqb2"
Apr 16 10:05:02.952462 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:02.952437 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9mqb2" podUID="bccbed26-7fad-44dd-b120-4fe2758154e5"
Apr 16 10:05:02.952988 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:02.952969 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2h7wk"]
Apr 16 10:05:02.964505 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:02.964484 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log"
Apr 16 10:05:02.964892 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:02.964869 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" event={"ID":"23649a52-557a-477c-838c-84b209078bbb","Type":"ContainerStarted","Data":"bf60bc1e8caa5e082a8176b4f883d04e5aea2636cdd58cb5ba7407128e5da894"}
Apr 16 10:05:02.964967 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:02.964905 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h7wk"
Apr 16 10:05:02.965027 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:02.965007 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h7wk" podUID="dba5a2c9-dd08-4be6-a76d-85742ada944e"
Apr 16 10:05:02.965461 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:02.965440 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:05:02.965709 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:02.965479 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:05:02.965709 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:02.965503 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:05:02.980394 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:02.980372 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:05:02.981446 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:02.981427 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw"
Apr 16 10:05:02.994881 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:02.994838 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" podStartSLOduration=9.939253464 podStartE2EDuration="27.994823743s" podCreationTimestamp="2026-04-16 10:04:35 +0000 UTC" firstStartedPulling="2026-04-16 10:04:38.015152157 +0000 UTC m=+2.795561327" lastFinishedPulling="2026-04-16 10:04:56.070722429 +0000 UTC m=+20.851131606" observedRunningTime="2026-04-16 10:05:02.993286477 +0000 UTC m=+27.773695695" watchObservedRunningTime="2026-04-16 10:05:02.994823743 +0000 UTC m=+27.775232930"
Apr 16 10:05:03.967796 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:03.967764 2570 generic.go:358] "Generic (PLEG): container finished" podID="191a958c-a1a7-4e33-9456-1f482a72fb5e" containerID="771c2e29fa3f7b5b8afe75961add96759ab10acbbbf7b30ae8d75d142f81b0e1" exitCode=0
Apr 16 10:05:03.968162 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:03.967845 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9smg" event={"ID":"191a958c-a1a7-4e33-9456-1f482a72fb5e","Type":"ContainerDied","Data":"771c2e29fa3f7b5b8afe75961add96759ab10acbbbf7b30ae8d75d142f81b0e1"}
Apr 16 10:05:04.794744 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:04.794715 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-znlvq"
Apr 16 10:05:04.794850 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:04.794715 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h7wk"
Apr 16 10:05:04.794850 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:04.794836 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9mqb2"
Apr 16 10:05:04.794908 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:04.794831 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-znlvq" podUID="e9fc47e9-a934-4e31-b64d-70aef48ffd3e"
Apr 16 10:05:04.794998 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:04.794980 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h7wk" podUID="dba5a2c9-dd08-4be6-a76d-85742ada944e"
Apr 16 10:05:04.795131 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:04.795102 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9mqb2" podUID="bccbed26-7fad-44dd-b120-4fe2758154e5"
Apr 16 10:05:04.971748 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:04.971716 2570 generic.go:358] "Generic (PLEG): container finished" podID="191a958c-a1a7-4e33-9456-1f482a72fb5e" containerID="af7ebd117837efb373b6e42c4969549e374ff555d3a2458e2215040584449566" exitCode=0
Apr 16 10:05:04.972082 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:04.971789 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9smg" event={"ID":"191a958c-a1a7-4e33-9456-1f482a72fb5e","Type":"ContainerDied","Data":"af7ebd117837efb373b6e42c4969549e374ff555d3a2458e2215040584449566"}
Apr 16 10:05:06.794890 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:06.794858 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h7wk"
Apr 16 10:05:06.795393 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:06.794863 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9mqb2"
Apr 16 10:05:06.795393 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:06.794957 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h7wk" podUID="dba5a2c9-dd08-4be6-a76d-85742ada944e"
Apr 16 10:05:06.795393 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:06.794858 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-znlvq"
Apr 16 10:05:06.795393 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:06.795085 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9mqb2" podUID="bccbed26-7fad-44dd-b120-4fe2758154e5"
Apr 16 10:05:06.795393 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:06.795179 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-znlvq" podUID="e9fc47e9-a934-4e31-b64d-70aef48ffd3e"
Apr 16 10:05:08.794903 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:08.794697 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9mqb2"
Apr 16 10:05:08.795371 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:08.794697 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-znlvq"
Apr 16 10:05:08.795371 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:08.795012 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9mqb2" podUID="bccbed26-7fad-44dd-b120-4fe2758154e5"
Apr 16 10:05:08.795371 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:08.794697 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h7wk"
Apr 16 10:05:08.795371 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:08.795092 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-znlvq" podUID="e9fc47e9-a934-4e31-b64d-70aef48ffd3e"
Apr 16 10:05:08.795371 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:08.795148 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2h7wk" podUID="dba5a2c9-dd08-4be6-a76d-85742ada944e"
Apr 16 10:05:09.005332 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.005261 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-196.ec2.internal" event="NodeReady"
Apr 16 10:05:09.005477 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.005393 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 10:05:09.038808 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.038775 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5"]
Apr 16 10:05:09.073851 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.073814 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-69c9cc7857-lwqxr"]
Apr 16 10:05:09.074011 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.073996 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5"
Apr 16 10:05:09.076376 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.076349 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 16 10:05:09.076597 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.076399 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-652wv\""
Apr 16 10:05:09.076597 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.076567 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 16 10:05:09.101832 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.101807 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76864b55b9-5wp8p"]
Apr 16 10:05:09.101984 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.101963 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr"
Apr 16 10:05:09.104382 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.104361 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 10:05:09.104512 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.104392 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 10:05:09.104786 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.104768 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 10:05:09.104872 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.104833 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-kpfjj\""
Apr 16 10:05:09.111283 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.111260 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 10:05:09.114590 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.114571 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw"]
Apr 16 10:05:09.114726 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.114706 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76864b55b9-5wp8p"
Apr 16 10:05:09.117279 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.117137 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-lchvf\""
Apr 16 10:05:09.117279 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.117188 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 10:05:09.117279 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.117141 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 10:05:09.117499 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.117298 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 16 10:05:09.117499 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.117327 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 10:05:09.142339 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.142314 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5"]
Apr 16 10:05:09.142479 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.142458 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw"
Apr 16 10:05:09.144765 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.144748 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 16 10:05:09.161125 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.161102 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5"]
Apr 16 10:05:09.161228 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.161132 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dwt6x"]
Apr 16 10:05:09.161281 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.161254 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5"
Apr 16 10:05:09.164655 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.164635 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 16 10:05:09.164891 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.164871 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 16 10:05:09.164975 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.164942 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 16 10:05:09.165303 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.165284 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 16 10:05:09.175201 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.175183 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4hw98"]
Apr 16 10:05:09.175356 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.175335 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dwt6x"
Apr 16 10:05:09.177760 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.177742 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cwtl6\""
Apr 16 10:05:09.177853 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.177816 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 10:05:09.177913 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.177850 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 10:05:09.177991 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.177970 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 10:05:09.196378 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.196349 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw"]
Apr 16 10:05:09.196471 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.196385 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5"]
Apr 16 10:05:09.196471 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.196417 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69c9cc7857-lwqxr"]
Apr 16 10:05:09.196471 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.196426 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dwt6x"]
Apr 16 10:05:09.196471 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.196434 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4hw98"]
Apr 16 10:05:09.196471 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.196441 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76864b55b9-5wp8p"]
Apr 16 10:05:09.196690 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.196483 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4hw98"
Apr 16 10:05:09.198770 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.198752 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ccsn6\""
Apr 16 10:05:09.198867 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.198755 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 10:05:09.198867 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.198807 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 10:05:09.245938 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.245909 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e2919cf8-ac8b-4c58-be68-19e8c41e0c82-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-76864b55b9-5wp8p\" (UID: \"e2919cf8-ac8b-4c58-be68-19e8c41e0c82\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76864b55b9-5wp8p"
Apr 16 10:05:09.246120 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.245956 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b74f7a44-b523-4723-8224-aa8bc5a5759c-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-qjjn5\" (UID: \"b74f7a44-b523-4723-8224-aa8bc5a5759c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5"
Apr 16 10:05:09.246120 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.245976 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-qjjn5\" (UID: \"b74f7a44-b523-4723-8224-aa8bc5a5759c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5"
Apr 16 10:05:09.246120 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.245995 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/498de4dd-364c-48a3-a4a8-9b4d0bce5ce1-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-686b9879fb-cx6k5\" (UID: \"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5"
Apr 16 10:05:09.246120 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.246064 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr"
Apr 16 10:05:09.246120 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.246105 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-image-registry-private-configuration\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr"
Apr 16 10:05:09.246320 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.246137 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-certificates\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr"
Apr 16 10:05:09.246320 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.246165 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ed278555-41a2-4c45-aac6-3d9454782e11-klusterlet-config\") pod \"klusterlet-addon-workmgr-6d5df4c6cb-fm7kw\" (UID: \"ed278555-41a2-4c45-aac6-3d9454782e11\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw"
Apr 16 10:05:09.246320 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.246190 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/498de4dd-364c-48a3-a4a8-9b4d0bce5ce1-ca\") pod \"cluster-proxy-proxy-agent-686b9879fb-cx6k5\" (UID: \"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5"
Apr 16 10:05:09.246320 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.246216 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/498de4dd-364c-48a3-a4a8-9b4d0bce5ce1-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-686b9879fb-cx6k5\" (UID: \"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5"
Apr 16 10:05:09.246320 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.246286 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-trusted-ca\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr"
Apr 16 10:05:09.246320 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.246312 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh4wc\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-kube-api-access-fh4wc\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr"
Apr 16 10:05:09.246647 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.246340 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-installation-pull-secrets\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr"
Apr 16 10:05:09.246647 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.246367 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/498de4dd-364c-48a3-a4a8-9b4d0bce5ce1-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-686b9879fb-cx6k5\" (UID: \"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1\") "
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" Apr 16 10:05:09.246647 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.246432 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/498de4dd-364c-48a3-a4a8-9b4d0bce5ce1-hub\") pod \"cluster-proxy-proxy-agent-686b9879fb-cx6k5\" (UID: \"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" Apr 16 10:05:09.246647 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.246484 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-ca-trust-extracted\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:05:09.246647 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.246516 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-bound-sa-token\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:05:09.246647 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.246556 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp8l8\" (UniqueName: \"kubernetes.io/projected/498de4dd-364c-48a3-a4a8-9b4d0bce5ce1-kube-api-access-mp8l8\") pod \"cluster-proxy-proxy-agent-686b9879fb-cx6k5\" (UID: \"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" Apr 16 10:05:09.246647 ip-10-0-143-196 kubenswrapper[2570]: 
I0416 10:05:09.246602 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4rk4\" (UniqueName: \"kubernetes.io/projected/e2919cf8-ac8b-4c58-be68-19e8c41e0c82-kube-api-access-k4rk4\") pod \"managed-serviceaccount-addon-agent-76864b55b9-5wp8p\" (UID: \"e2919cf8-ac8b-4c58-be68-19e8c41e0c82\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76864b55b9-5wp8p" Apr 16 10:05:09.246647 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.246627 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed278555-41a2-4c45-aac6-3d9454782e11-tmp\") pod \"klusterlet-addon-workmgr-6d5df4c6cb-fm7kw\" (UID: \"ed278555-41a2-4c45-aac6-3d9454782e11\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw" Apr 16 10:05:09.246924 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.246650 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzxqr\" (UniqueName: \"kubernetes.io/projected/ed278555-41a2-4c45-aac6-3d9454782e11-kube-api-access-kzxqr\") pod \"klusterlet-addon-workmgr-6d5df4c6cb-fm7kw\" (UID: \"ed278555-41a2-4c45-aac6-3d9454782e11\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw" Apr 16 10:05:09.347843 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.347805 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/498de4dd-364c-48a3-a4a8-9b4d0bce5ce1-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-686b9879fb-cx6k5\" (UID: \"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" Apr 16 10:05:09.347843 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.347847 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-trusted-ca\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:05:09.348077 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.347874 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-qjjn5\" (UID: \"b74f7a44-b523-4723-8224-aa8bc5a5759c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5" Apr 16 10:05:09.348077 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.347970 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 10:05:09.348077 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.348037 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert podName:b74f7a44-b523-4723-8224-aa8bc5a5759c nodeName:}" failed. No retries permitted until 2026-04-16 10:05:09.848009278 +0000 UTC m=+34.628418451 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-qjjn5" (UID: "b74f7a44-b523-4723-8224-aa8bc5a5759c") : secret "networking-console-plugin-cert" not found Apr 16 10:05:09.348077 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.348059 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ed278555-41a2-4c45-aac6-3d9454782e11-klusterlet-config\") pod \"klusterlet-addon-workmgr-6d5df4c6cb-fm7kw\" (UID: \"ed278555-41a2-4c45-aac6-3d9454782e11\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw" Apr 16 10:05:09.348277 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.348087 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/498de4dd-364c-48a3-a4a8-9b4d0bce5ce1-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-686b9879fb-cx6k5\" (UID: \"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" Apr 16 10:05:09.348277 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.348114 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh89p\" (UniqueName: \"kubernetes.io/projected/f8ed2d4a-7837-47c0-aa78-9cb656214b23-kube-api-access-nh89p\") pod \"dns-default-4hw98\" (UID: \"f8ed2d4a-7837-47c0-aa78-9cb656214b23\") " pod="openshift-dns/dns-default-4hw98" Apr 16 10:05:09.348277 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.348140 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8ed2d4a-7837-47c0-aa78-9cb656214b23-config-volume\") pod 
\"dns-default-4hw98\" (UID: \"f8ed2d4a-7837-47c0-aa78-9cb656214b23\") " pod="openshift-dns/dns-default-4hw98" Apr 16 10:05:09.348277 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.348165 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/498de4dd-364c-48a3-a4a8-9b4d0bce5ce1-ca\") pod \"cluster-proxy-proxy-agent-686b9879fb-cx6k5\" (UID: \"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" Apr 16 10:05:09.348277 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.348192 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/498de4dd-364c-48a3-a4a8-9b4d0bce5ce1-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-686b9879fb-cx6k5\" (UID: \"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" Apr 16 10:05:09.348277 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.348222 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-installation-pull-secrets\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:05:09.348667 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.348644 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/498de4dd-364c-48a3-a4a8-9b4d0bce5ce1-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-686b9879fb-cx6k5\" (UID: \"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" Apr 16 10:05:09.348796 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.348771 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f8ed2d4a-7837-47c0-aa78-9cb656214b23-tmp-dir\") pod \"dns-default-4hw98\" (UID: \"f8ed2d4a-7837-47c0-aa78-9cb656214b23\") " pod="openshift-dns/dns-default-4hw98" Apr 16 10:05:09.348862 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.348838 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/498de4dd-364c-48a3-a4a8-9b4d0bce5ce1-hub\") pod \"cluster-proxy-proxy-agent-686b9879fb-cx6k5\" (UID: \"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" Apr 16 10:05:09.348913 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.348869 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-ca-trust-extracted\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:05:09.348913 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.348895 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert\") pod \"ingress-canary-dwt6x\" (UID: \"9007958c-4532-4e2d-a3b7-94207d6d562a\") " pod="openshift-ingress-canary/ingress-canary-dwt6x" Apr 16 10:05:09.348995 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.348919 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzxqr\" (UniqueName: \"kubernetes.io/projected/ed278555-41a2-4c45-aac6-3d9454782e11-kube-api-access-kzxqr\") pod \"klusterlet-addon-workmgr-6d5df4c6cb-fm7kw\" (UID: \"ed278555-41a2-4c45-aac6-3d9454782e11\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw" Apr 16 10:05:09.348995 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.348937 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-trusted-ca\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:05:09.348995 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.348949 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e2919cf8-ac8b-4c58-be68-19e8c41e0c82-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-76864b55b9-5wp8p\" (UID: \"e2919cf8-ac8b-4c58-be68-19e8c41e0c82\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76864b55b9-5wp8p" Apr 16 10:05:09.349113 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.349027 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:05:09.349113 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.349055 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b74f7a44-b523-4723-8224-aa8bc5a5759c-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-qjjn5\" (UID: \"b74f7a44-b523-4723-8224-aa8bc5a5759c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5" Apr 16 10:05:09.349113 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.349084 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls\") pod \"dns-default-4hw98\" (UID: \"f8ed2d4a-7837-47c0-aa78-9cb656214b23\") " pod="openshift-dns/dns-default-4hw98" Apr 16 10:05:09.349296 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.349111 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-image-registry-private-configuration\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:05:09.349296 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.349136 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-certificates\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:05:09.349296 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.349174 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fh4wc\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-kube-api-access-fh4wc\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:05:09.349296 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.349217 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-bound-sa-token\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: 
\"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:05:09.349296 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.349240 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mp8l8\" (UniqueName: \"kubernetes.io/projected/498de4dd-364c-48a3-a4a8-9b4d0bce5ce1-kube-api-access-mp8l8\") pod \"cluster-proxy-proxy-agent-686b9879fb-cx6k5\" (UID: \"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" Apr 16 10:05:09.349296 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.349277 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4rk4\" (UniqueName: \"kubernetes.io/projected/e2919cf8-ac8b-4c58-be68-19e8c41e0c82-kube-api-access-k4rk4\") pod \"managed-serviceaccount-addon-agent-76864b55b9-5wp8p\" (UID: \"e2919cf8-ac8b-4c58-be68-19e8c41e0c82\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76864b55b9-5wp8p" Apr 16 10:05:09.349560 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.349303 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed278555-41a2-4c45-aac6-3d9454782e11-tmp\") pod \"klusterlet-addon-workmgr-6d5df4c6cb-fm7kw\" (UID: \"ed278555-41a2-4c45-aac6-3d9454782e11\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw" Apr 16 10:05:09.349560 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.349327 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brcq6\" (UniqueName: \"kubernetes.io/projected/9007958c-4532-4e2d-a3b7-94207d6d562a-kube-api-access-brcq6\") pod \"ingress-canary-dwt6x\" (UID: \"9007958c-4532-4e2d-a3b7-94207d6d562a\") " pod="openshift-ingress-canary/ingress-canary-dwt6x" Apr 16 10:05:09.349560 ip-10-0-143-196 
kubenswrapper[2570]: E0416 10:05:09.349447 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 10:05:09.349560 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.349458 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69c9cc7857-lwqxr: secret "image-registry-tls" not found Apr 16 10:05:09.349560 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.349503 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls podName:4b799a1d-5b70-47ff-b95d-e1cc88088ee8 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:09.849489885 +0000 UTC m=+34.629899050 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls") pod "image-registry-69c9cc7857-lwqxr" (UID: "4b799a1d-5b70-47ff-b95d-e1cc88088ee8") : secret "image-registry-tls" not found Apr 16 10:05:09.350375 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.350068 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b74f7a44-b523-4723-8224-aa8bc5a5759c-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-qjjn5\" (UID: \"b74f7a44-b523-4723-8224-aa8bc5a5759c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5" Apr 16 10:05:09.353764 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.353312 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/498de4dd-364c-48a3-a4a8-9b4d0bce5ce1-ca\") pod \"cluster-proxy-proxy-agent-686b9879fb-cx6k5\" (UID: \"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" Apr 16 10:05:09.353764 
ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.353395 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-installation-pull-secrets\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:05:09.353764 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.353423 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-image-registry-private-configuration\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:05:09.353764 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.353448 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ed278555-41a2-4c45-aac6-3d9454782e11-klusterlet-config\") pod \"klusterlet-addon-workmgr-6d5df4c6cb-fm7kw\" (UID: \"ed278555-41a2-4c45-aac6-3d9454782e11\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw" Apr 16 10:05:09.353764 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.353572 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e2919cf8-ac8b-4c58-be68-19e8c41e0c82-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-76864b55b9-5wp8p\" (UID: \"e2919cf8-ac8b-4c58-be68-19e8c41e0c82\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76864b55b9-5wp8p" Apr 16 10:05:09.353764 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.353674 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/498de4dd-364c-48a3-a4a8-9b4d0bce5ce1-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-686b9879fb-cx6k5\" (UID: \"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" Apr 16 10:05:09.353764 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.353703 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed278555-41a2-4c45-aac6-3d9454782e11-tmp\") pod \"klusterlet-addon-workmgr-6d5df4c6cb-fm7kw\" (UID: \"ed278555-41a2-4c45-aac6-3d9454782e11\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw" Apr 16 10:05:09.353764 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.353728 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-ca-trust-extracted\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:05:09.354212 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.353976 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-certificates\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:05:09.354212 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.354169 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/498de4dd-364c-48a3-a4a8-9b4d0bce5ce1-hub\") pod \"cluster-proxy-proxy-agent-686b9879fb-cx6k5\" (UID: \"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" Apr 16 
10:05:09.354643 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.354622 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/498de4dd-364c-48a3-a4a8-9b4d0bce5ce1-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-686b9879fb-cx6k5\" (UID: \"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" Apr 16 10:05:09.364199 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.364178 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp8l8\" (UniqueName: \"kubernetes.io/projected/498de4dd-364c-48a3-a4a8-9b4d0bce5ce1-kube-api-access-mp8l8\") pod \"cluster-proxy-proxy-agent-686b9879fb-cx6k5\" (UID: \"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" Apr 16 10:05:09.364541 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.364482 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-bound-sa-token\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:05:09.364947 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.364906 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4rk4\" (UniqueName: \"kubernetes.io/projected/e2919cf8-ac8b-4c58-be68-19e8c41e0c82-kube-api-access-k4rk4\") pod \"managed-serviceaccount-addon-agent-76864b55b9-5wp8p\" (UID: \"e2919cf8-ac8b-4c58-be68-19e8c41e0c82\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76864b55b9-5wp8p" Apr 16 10:05:09.365267 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.365248 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fh4wc\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-kube-api-access-fh4wc\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr"
Apr 16 10:05:09.365964 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.365946 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzxqr\" (UniqueName: \"kubernetes.io/projected/ed278555-41a2-4c45-aac6-3d9454782e11-kube-api-access-kzxqr\") pod \"klusterlet-addon-workmgr-6d5df4c6cb-fm7kw\" (UID: \"ed278555-41a2-4c45-aac6-3d9454782e11\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw"
Apr 16 10:05:09.432987 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.432951 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76864b55b9-5wp8p"
Apr 16 10:05:09.450089 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.450061 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert\") pod \"ingress-canary-dwt6x\" (UID: \"9007958c-4532-4e2d-a3b7-94207d6d562a\") " pod="openshift-ingress-canary/ingress-canary-dwt6x"
Apr 16 10:05:09.450227 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.450118 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls\") pod \"dns-default-4hw98\" (UID: \"f8ed2d4a-7837-47c0-aa78-9cb656214b23\") " pod="openshift-dns/dns-default-4hw98"
Apr 16 10:05:09.450227 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.450154 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs\") pod \"network-metrics-daemon-9mqb2\" (UID: \"bccbed26-7fad-44dd-b120-4fe2758154e5\") " pod="openshift-multus/network-metrics-daemon-9mqb2"
Apr 16 10:05:09.450227 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.450215 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brcq6\" (UniqueName: \"kubernetes.io/projected/9007958c-4532-4e2d-a3b7-94207d6d562a-kube-api-access-brcq6\") pod \"ingress-canary-dwt6x\" (UID: \"9007958c-4532-4e2d-a3b7-94207d6d562a\") " pod="openshift-ingress-canary/ingress-canary-dwt6x"
Apr 16 10:05:09.450387 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.450222 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 10:05:09.450387 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.450269 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 10:05:09.450387 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.450298 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert podName:9007958c-4532-4e2d-a3b7-94207d6d562a nodeName:}" failed. No retries permitted until 2026-04-16 10:05:09.950276626 +0000 UTC m=+34.730685793 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert") pod "ingress-canary-dwt6x" (UID: "9007958c-4532-4e2d-a3b7-94207d6d562a") : secret "canary-serving-cert" not found
Apr 16 10:05:09.450387 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.450323 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs podName:bccbed26-7fad-44dd-b120-4fe2758154e5 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:41.450307115 +0000 UTC m=+66.230716296 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs") pod "network-metrics-daemon-9mqb2" (UID: "bccbed26-7fad-44dd-b120-4fe2758154e5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 10:05:09.450387 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.450269 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nh89p\" (UniqueName: \"kubernetes.io/projected/f8ed2d4a-7837-47c0-aa78-9cb656214b23-kube-api-access-nh89p\") pod \"dns-default-4hw98\" (UID: \"f8ed2d4a-7837-47c0-aa78-9cb656214b23\") " pod="openshift-dns/dns-default-4hw98"
Apr 16 10:05:09.450387 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.450356 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8ed2d4a-7837-47c0-aa78-9cb656214b23-config-volume\") pod \"dns-default-4hw98\" (UID: \"f8ed2d4a-7837-47c0-aa78-9cb656214b23\") " pod="openshift-dns/dns-default-4hw98"
Apr 16 10:05:09.450775 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.450392 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f8ed2d4a-7837-47c0-aa78-9cb656214b23-tmp-dir\") pod \"dns-default-4hw98\" (UID: \"f8ed2d4a-7837-47c0-aa78-9cb656214b23\") " pod="openshift-dns/dns-default-4hw98"
Apr 16 10:05:09.450775 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.450222 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 10:05:09.450775 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.450473 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls podName:f8ed2d4a-7837-47c0-aa78-9cb656214b23 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:09.950461176 +0000 UTC m=+34.730870342 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls") pod "dns-default-4hw98" (UID: "f8ed2d4a-7837-47c0-aa78-9cb656214b23") : secret "dns-default-metrics-tls" not found
Apr 16 10:05:09.450920 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.450777 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f8ed2d4a-7837-47c0-aa78-9cb656214b23-tmp-dir\") pod \"dns-default-4hw98\" (UID: \"f8ed2d4a-7837-47c0-aa78-9cb656214b23\") " pod="openshift-dns/dns-default-4hw98"
Apr 16 10:05:09.450973 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.450936 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8ed2d4a-7837-47c0-aa78-9cb656214b23-config-volume\") pod \"dns-default-4hw98\" (UID: \"f8ed2d4a-7837-47c0-aa78-9cb656214b23\") " pod="openshift-dns/dns-default-4hw98"
Apr 16 10:05:09.452100 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.452067 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw"
Apr 16 10:05:09.459168 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.459144 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh89p\" (UniqueName: \"kubernetes.io/projected/f8ed2d4a-7837-47c0-aa78-9cb656214b23-kube-api-access-nh89p\") pod \"dns-default-4hw98\" (UID: \"f8ed2d4a-7837-47c0-aa78-9cb656214b23\") " pod="openshift-dns/dns-default-4hw98"
Apr 16 10:05:09.459290 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.459254 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brcq6\" (UniqueName: \"kubernetes.io/projected/9007958c-4532-4e2d-a3b7-94207d6d562a-kube-api-access-brcq6\") pod \"ingress-canary-dwt6x\" (UID: \"9007958c-4532-4e2d-a3b7-94207d6d562a\") " pod="openshift-ingress-canary/ingress-canary-dwt6x"
Apr 16 10:05:09.471803 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.471782 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5"
Apr 16 10:05:09.551092 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.551053 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhjtp\" (UniqueName: \"kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp\") pod \"network-check-target-2h7wk\" (UID: \"dba5a2c9-dd08-4be6-a76d-85742ada944e\") " pod="openshift-network-diagnostics/network-check-target-2h7wk"
Apr 16 10:05:09.551242 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.551214 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 10:05:09.551242 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.551241 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 10:05:09.551331 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.551252 2570 projected.go:194] Error preparing data for projected volume kube-api-access-mhjtp for pod openshift-network-diagnostics/network-check-target-2h7wk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 10:05:09.551331 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.551312 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp podName:dba5a2c9-dd08-4be6-a76d-85742ada944e nodeName:}" failed. No retries permitted until 2026-04-16 10:05:41.551299143 +0000 UTC m=+66.331708307 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-mhjtp" (UniqueName: "kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp") pod "network-check-target-2h7wk" (UID: "dba5a2c9-dd08-4be6-a76d-85742ada944e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 10:05:09.854067 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.854024 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-qjjn5\" (UID: \"b74f7a44-b523-4723-8224-aa8bc5a5759c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5"
Apr 16 10:05:09.854521 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.854181 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 10:05:09.854521 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.854191 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr"
Apr 16 10:05:09.854521 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.854263 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert podName:b74f7a44-b523-4723-8224-aa8bc5a5759c nodeName:}" failed. No retries permitted until 2026-04-16 10:05:10.854242025 +0000 UTC m=+35.634651203 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-qjjn5" (UID: "b74f7a44-b523-4723-8224-aa8bc5a5759c") : secret "networking-console-plugin-cert" not found
Apr 16 10:05:09.854521 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.854322 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 10:05:09.854521 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.854337 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69c9cc7857-lwqxr: secret "image-registry-tls" not found
Apr 16 10:05:09.854521 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.854389 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls podName:4b799a1d-5b70-47ff-b95d-e1cc88088ee8 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:10.854374651 +0000 UTC m=+35.634783817 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls") pod "image-registry-69c9cc7857-lwqxr" (UID: "4b799a1d-5b70-47ff-b95d-e1cc88088ee8") : secret "image-registry-tls" not found
Apr 16 10:05:09.955653 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.955615 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls\") pod \"dns-default-4hw98\" (UID: \"f8ed2d4a-7837-47c0-aa78-9cb656214b23\") " pod="openshift-dns/dns-default-4hw98"
Apr 16 10:05:09.955839 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.955681 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret\") pod \"global-pull-secret-syncer-znlvq\" (UID: \"e9fc47e9-a934-4e31-b64d-70aef48ffd3e\") " pod="kube-system/global-pull-secret-syncer-znlvq"
Apr 16 10:05:09.955839 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:09.955769 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert\") pod \"ingress-canary-dwt6x\" (UID: \"9007958c-4532-4e2d-a3b7-94207d6d562a\") " pod="openshift-ingress-canary/ingress-canary-dwt6x"
Apr 16 10:05:09.955839 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.955779 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 10:05:09.955839 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.955815 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 10:05:09.956052 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.955852 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls podName:f8ed2d4a-7837-47c0-aa78-9cb656214b23 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:10.955832822 +0000 UTC m=+35.736241988 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls") pod "dns-default-4hw98" (UID: "f8ed2d4a-7837-47c0-aa78-9cb656214b23") : secret "dns-default-metrics-tls" not found
Apr 16 10:05:09.956052 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.955870 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret podName:e9fc47e9-a934-4e31-b64d-70aef48ffd3e nodeName:}" failed. No retries permitted until 2026-04-16 10:05:41.9558625 +0000 UTC m=+66.736271666 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret") pod "global-pull-secret-syncer-znlvq" (UID: "e9fc47e9-a934-4e31-b64d-70aef48ffd3e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 10:05:09.956052 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.955902 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 10:05:09.956052 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:09.955959 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert podName:9007958c-4532-4e2d-a3b7-94207d6d562a nodeName:}" failed. No retries permitted until 2026-04-16 10:05:10.955943289 +0000 UTC m=+35.736352457 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert") pod "ingress-canary-dwt6x" (UID: "9007958c-4532-4e2d-a3b7-94207d6d562a") : secret "canary-serving-cert" not found
Apr 16 10:05:10.736712 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:10.736681 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76864b55b9-5wp8p"]
Apr 16 10:05:10.740565 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:10.740479 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5"]
Apr 16 10:05:10.740565 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:10.740540 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw"]
Apr 16 10:05:10.774414 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:05:10.774382 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2919cf8_ac8b_4c58_be68_19e8c41e0c82.slice/crio-5dd43a4fbccf9b4bdf21ba83330ed13d14d430abd2641018ad5c9256b943e13b WatchSource:0}: Error finding container 5dd43a4fbccf9b4bdf21ba83330ed13d14d430abd2641018ad5c9256b943e13b: Status 404 returned error can't find the container with id 5dd43a4fbccf9b4bdf21ba83330ed13d14d430abd2641018ad5c9256b943e13b
Apr 16 10:05:10.775123 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:05:10.775101 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded278555_41a2_4c45_aac6_3d9454782e11.slice/crio-8d53f0d3ad2d82614ef3c3d2ff85c7b34a194be0852ba69d80407c9edeb7ffae WatchSource:0}: Error finding container 8d53f0d3ad2d82614ef3c3d2ff85c7b34a194be0852ba69d80407c9edeb7ffae: Status 404 returned error can't find the container with id 8d53f0d3ad2d82614ef3c3d2ff85c7b34a194be0852ba69d80407c9edeb7ffae
Apr 16 10:05:10.780780 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:05:10.780759 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod498de4dd_364c_48a3_a4a8_9b4d0bce5ce1.slice/crio-06e28aef031c9a24fbad1e80ea94b4dba0a678bb1c5193d52fe9b3dc42f90a9d WatchSource:0}: Error finding container 06e28aef031c9a24fbad1e80ea94b4dba0a678bb1c5193d52fe9b3dc42f90a9d: Status 404 returned error can't find the container with id 06e28aef031c9a24fbad1e80ea94b4dba0a678bb1c5193d52fe9b3dc42f90a9d
Apr 16 10:05:10.794148 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:10.794089 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h7wk"
Apr 16 10:05:10.794148 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:10.794102 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-znlvq"
Apr 16 10:05:10.794148 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:10.794122 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9mqb2"
Apr 16 10:05:10.796419 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:10.796389 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 10:05:10.796517 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:10.796399 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 10:05:10.796608 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:10.796519 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 10:05:10.796680 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:10.796663 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 10:05:10.796735 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:10.796717 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-btvgh\""
Apr 16 10:05:10.796735 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:10.796725 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mvsr8\""
Apr 16 10:05:10.865607 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:10.865573 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-qjjn5\" (UID: \"b74f7a44-b523-4723-8224-aa8bc5a5759c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5"
Apr 16 10:05:10.866441 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:10.865669 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr"
Apr 16 10:05:10.866441 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:10.865833 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 10:05:10.866441 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:10.865851 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69c9cc7857-lwqxr: secret "image-registry-tls" not found
Apr 16 10:05:10.866441 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:10.865915 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls podName:4b799a1d-5b70-47ff-b95d-e1cc88088ee8 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:12.865897187 +0000 UTC m=+37.646306367 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls") pod "image-registry-69c9cc7857-lwqxr" (UID: "4b799a1d-5b70-47ff-b95d-e1cc88088ee8") : secret "image-registry-tls" not found
Apr 16 10:05:10.866441 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:10.865938 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 10:05:10.866441 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:10.866012 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert podName:b74f7a44-b523-4723-8224-aa8bc5a5759c nodeName:}" failed. No retries permitted until 2026-04-16 10:05:12.865995392 +0000 UTC m=+37.646404570 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-qjjn5" (UID: "b74f7a44-b523-4723-8224-aa8bc5a5759c") : secret "networking-console-plugin-cert" not found
Apr 16 10:05:10.966517 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:10.966336 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert\") pod \"ingress-canary-dwt6x\" (UID: \"9007958c-4532-4e2d-a3b7-94207d6d562a\") " pod="openshift-ingress-canary/ingress-canary-dwt6x"
Apr 16 10:05:10.966675 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:10.966559 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls\") pod \"dns-default-4hw98\" (UID: \"f8ed2d4a-7837-47c0-aa78-9cb656214b23\") " pod="openshift-dns/dns-default-4hw98"
Apr 16 10:05:10.966675 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:10.966494 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 10:05:10.966675 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:10.966648 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert podName:9007958c-4532-4e2d-a3b7-94207d6d562a nodeName:}" failed. No retries permitted until 2026-04-16 10:05:12.966627434 +0000 UTC m=+37.747036606 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert") pod "ingress-canary-dwt6x" (UID: "9007958c-4532-4e2d-a3b7-94207d6d562a") : secret "canary-serving-cert" not found
Apr 16 10:05:10.966801 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:10.966734 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 10:05:10.966801 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:10.966781 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls podName:f8ed2d4a-7837-47c0-aa78-9cb656214b23 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:12.96676981 +0000 UTC m=+37.747178975 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls") pod "dns-default-4hw98" (UID: "f8ed2d4a-7837-47c0-aa78-9cb656214b23") : secret "dns-default-metrics-tls" not found
Apr 16 10:05:10.985172 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:10.985143 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76864b55b9-5wp8p" event={"ID":"e2919cf8-ac8b-4c58-be68-19e8c41e0c82","Type":"ContainerStarted","Data":"5dd43a4fbccf9b4bdf21ba83330ed13d14d430abd2641018ad5c9256b943e13b"}
Apr 16 10:05:10.987662 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:10.987640 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9smg" event={"ID":"191a958c-a1a7-4e33-9456-1f482a72fb5e","Type":"ContainerStarted","Data":"f689a913800a3156d70233580f148f7680058c5cc016d5575ae89578f05d7b69"}
Apr 16 10:05:10.988634 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:10.988614 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" event={"ID":"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1","Type":"ContainerStarted","Data":"06e28aef031c9a24fbad1e80ea94b4dba0a678bb1c5193d52fe9b3dc42f90a9d"}
Apr 16 10:05:10.989596 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:10.989574 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw" event={"ID":"ed278555-41a2-4c45-aac6-3d9454782e11","Type":"ContainerStarted","Data":"8d53f0d3ad2d82614ef3c3d2ff85c7b34a194be0852ba69d80407c9edeb7ffae"}
Apr 16 10:05:12.002112 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:12.001212 2570 generic.go:358] "Generic (PLEG): container finished" podID="191a958c-a1a7-4e33-9456-1f482a72fb5e" containerID="f689a913800a3156d70233580f148f7680058c5cc016d5575ae89578f05d7b69" exitCode=0
Apr 16 10:05:12.002112 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:12.001260 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9smg" event={"ID":"191a958c-a1a7-4e33-9456-1f482a72fb5e","Type":"ContainerDied","Data":"f689a913800a3156d70233580f148f7680058c5cc016d5575ae89578f05d7b69"}
Apr 16 10:05:12.886356 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:12.886221 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr"
Apr 16 10:05:12.886520 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:12.886421 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 10:05:12.886520 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:12.886449 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69c9cc7857-lwqxr: secret "image-registry-tls" not found
Apr 16 10:05:12.886520 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:12.886514 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls podName:4b799a1d-5b70-47ff-b95d-e1cc88088ee8 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:16.886493582 +0000 UTC m=+41.666902771 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls") pod "image-registry-69c9cc7857-lwqxr" (UID: "4b799a1d-5b70-47ff-b95d-e1cc88088ee8") : secret "image-registry-tls" not found
Apr 16 10:05:12.887118 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:12.886838 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-qjjn5\" (UID: \"b74f7a44-b523-4723-8224-aa8bc5a5759c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5"
Apr 16 10:05:12.887118 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:12.887040 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 10:05:12.887118 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:12.887092 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert podName:b74f7a44-b523-4723-8224-aa8bc5a5759c nodeName:}" failed. No retries permitted until 2026-04-16 10:05:16.887076286 +0000 UTC m=+41.667485456 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-qjjn5" (UID: "b74f7a44-b523-4723-8224-aa8bc5a5759c") : secret "networking-console-plugin-cert" not found
Apr 16 10:05:12.988129 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:12.988089 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls\") pod \"dns-default-4hw98\" (UID: \"f8ed2d4a-7837-47c0-aa78-9cb656214b23\") " pod="openshift-dns/dns-default-4hw98"
Apr 16 10:05:12.988297 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:12.988236 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert\") pod \"ingress-canary-dwt6x\" (UID: \"9007958c-4532-4e2d-a3b7-94207d6d562a\") " pod="openshift-ingress-canary/ingress-canary-dwt6x"
Apr 16 10:05:12.988732 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:12.988424 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 10:05:12.988732 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:12.988489 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls podName:f8ed2d4a-7837-47c0-aa78-9cb656214b23 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:16.988470701 +0000 UTC m=+41.768879870 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls") pod "dns-default-4hw98" (UID: "f8ed2d4a-7837-47c0-aa78-9cb656214b23") : secret "dns-default-metrics-tls" not found
Apr 16 10:05:12.989469 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:12.989377 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 10:05:12.989469 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:12.989444 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert podName:9007958c-4532-4e2d-a3b7-94207d6d562a nodeName:}" failed. No retries permitted until 2026-04-16 10:05:16.989425085 +0000 UTC m=+41.769834252 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert") pod "ingress-canary-dwt6x" (UID: "9007958c-4532-4e2d-a3b7-94207d6d562a") : secret "canary-serving-cert" not found
Apr 16 10:05:13.016807 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:13.012439 2570 generic.go:358] "Generic (PLEG): container finished" podID="191a958c-a1a7-4e33-9456-1f482a72fb5e" containerID="c4510362b40bcac42bd29e7f27f49aec8830cda240ded5b9f922944de4bc8a30" exitCode=0
Apr 16 10:05:13.016807 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:13.012563 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9smg" event={"ID":"191a958c-a1a7-4e33-9456-1f482a72fb5e","Type":"ContainerDied","Data":"c4510362b40bcac42bd29e7f27f49aec8830cda240ded5b9f922944de4bc8a30"}
Apr 16 10:05:16.921285 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:16.921249 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr"
Apr 16 10:05:16.921650 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:16.921350 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-qjjn5\" (UID: \"b74f7a44-b523-4723-8224-aa8bc5a5759c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5"
Apr 16 10:05:16.921650 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:16.921379 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 10:05:16.921650 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:16.921400 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69c9cc7857-lwqxr: secret "image-registry-tls" not found
Apr 16 10:05:16.921650 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:16.921465 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls podName:4b799a1d-5b70-47ff-b95d-e1cc88088ee8 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:24.921449721 +0000 UTC m=+49.701858889 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls") pod "image-registry-69c9cc7857-lwqxr" (UID: "4b799a1d-5b70-47ff-b95d-e1cc88088ee8") : secret "image-registry-tls" not found
Apr 16 10:05:16.921650 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:16.921468 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 10:05:16.921650 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:16.921553 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert podName:b74f7a44-b523-4723-8224-aa8bc5a5759c nodeName:}" failed. No retries permitted until 2026-04-16 10:05:24.921523528 +0000 UTC m=+49.701932717 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-qjjn5" (UID: "b74f7a44-b523-4723-8224-aa8bc5a5759c") : secret "networking-console-plugin-cert" not found
Apr 16 10:05:17.021911 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:17.021886 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls\") pod \"dns-default-4hw98\" (UID: \"f8ed2d4a-7837-47c0-aa78-9cb656214b23\") " pod="openshift-dns/dns-default-4hw98"
Apr 16 10:05:17.022018 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:17.022008 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert\") pod \"ingress-canary-dwt6x\" (UID: \"9007958c-4532-4e2d-a3b7-94207d6d562a\") " pod="openshift-ingress-canary/ingress-canary-dwt6x"
Apr 16 10:05:17.022081 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:17.022024 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 10:05:17.022122 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:17.022106 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 10:05:17.022122 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:17.022121 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls podName:f8ed2d4a-7837-47c0-aa78-9cb656214b23 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:25.022102245 +0000 UTC m=+49.802511423 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls") pod "dns-default-4hw98" (UID: "f8ed2d4a-7837-47c0-aa78-9cb656214b23") : secret "dns-default-metrics-tls" not found
Apr 16 10:05:17.022207 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:17.022149 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert podName:9007958c-4532-4e2d-a3b7-94207d6d562a nodeName:}" failed. No retries permitted until 2026-04-16 10:05:25.022137859 +0000 UTC m=+49.802547025 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert") pod "ingress-canary-dwt6x" (UID: "9007958c-4532-4e2d-a3b7-94207d6d562a") : secret "canary-serving-cert" not found Apr 16 10:05:18.024220 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:18.024185 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76864b55b9-5wp8p" event={"ID":"e2919cf8-ac8b-4c58-be68-19e8c41e0c82","Type":"ContainerStarted","Data":"0d00e204b41cf6aeb5d2eba7a623ec2e6fbff976e7ce1ebc72feb2389e8296f5"} Apr 16 10:05:18.027016 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:18.026986 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9smg" event={"ID":"191a958c-a1a7-4e33-9456-1f482a72fb5e","Type":"ContainerStarted","Data":"ad31ea0ef045fed97abacb3547a5eed081ce6e72003d6931973ee5cb5910e6ff"} Apr 16 10:05:18.028199 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:18.028179 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" event={"ID":"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1","Type":"ContainerStarted","Data":"9ce08f831e6b732194f5a0f52f1ce927d115a5b9c73402a57e7e1a701973a71d"} Apr 16 10:05:18.029397 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:18.029373 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw" event={"ID":"ed278555-41a2-4c45-aac6-3d9454782e11","Type":"ContainerStarted","Data":"60fc1eed7938c17c63af34dea17e0e8354949fb99c32c58b2b5f6ca310e6765d"} Apr 16 10:05:18.029572 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:18.029551 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw" Apr 16 10:05:18.031096 
ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:18.031079 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw" Apr 16 10:05:18.038631 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:18.038589 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76864b55b9-5wp8p" podStartSLOduration=15.803967895 podStartE2EDuration="22.038578392s" podCreationTimestamp="2026-04-16 10:04:56 +0000 UTC" firstStartedPulling="2026-04-16 10:05:10.77667628 +0000 UTC m=+35.557085444" lastFinishedPulling="2026-04-16 10:05:17.011286765 +0000 UTC m=+41.791695941" observedRunningTime="2026-04-16 10:05:18.037935079 +0000 UTC m=+42.818344266" watchObservedRunningTime="2026-04-16 10:05:18.038578392 +0000 UTC m=+42.818987580" Apr 16 10:05:18.057709 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:18.057668 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-v9smg" podStartSLOduration=10.251627814 podStartE2EDuration="43.057656967s" podCreationTimestamp="2026-04-16 10:04:35 +0000 UTC" firstStartedPulling="2026-04-16 10:04:38.003916095 +0000 UTC m=+2.784325260" lastFinishedPulling="2026-04-16 10:05:10.809945244 +0000 UTC m=+35.590354413" observedRunningTime="2026-04-16 10:05:18.056038155 +0000 UTC m=+42.836447342" watchObservedRunningTime="2026-04-16 10:05:18.057656967 +0000 UTC m=+42.838066154" Apr 16 10:05:18.070544 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:18.070491 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw" podStartSLOduration=15.836057186 podStartE2EDuration="22.070481157s" podCreationTimestamp="2026-04-16 10:04:56 +0000 UTC" firstStartedPulling="2026-04-16 10:05:10.776866267 +0000 UTC m=+35.557275432" 
lastFinishedPulling="2026-04-16 10:05:17.011290226 +0000 UTC m=+41.791699403" observedRunningTime="2026-04-16 10:05:18.06985913 +0000 UTC m=+42.850268341" watchObservedRunningTime="2026-04-16 10:05:18.070481157 +0000 UTC m=+42.850890322" Apr 16 10:05:21.035948 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:21.035922 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" event={"ID":"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1","Type":"ContainerStarted","Data":"abb0749045029af3c63bba339b92bf87b34d89f95436fadfafc56742e822bb97"} Apr 16 10:05:22.039148 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:22.039114 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" event={"ID":"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1","Type":"ContainerStarted","Data":"f6e804b33608cb110a6d9454a7489c10b5242793bbe7f7259fd3ada5b707dfe1"} Apr 16 10:05:22.057609 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:22.057565 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" podStartSLOduration=15.945381525 podStartE2EDuration="26.057551659s" podCreationTimestamp="2026-04-16 10:04:56 +0000 UTC" firstStartedPulling="2026-04-16 10:05:10.786013352 +0000 UTC m=+35.566422517" lastFinishedPulling="2026-04-16 10:05:20.898183485 +0000 UTC m=+45.678592651" observedRunningTime="2026-04-16 10:05:22.056664102 +0000 UTC m=+46.837073299" watchObservedRunningTime="2026-04-16 10:05:22.057551659 +0000 UTC m=+46.837960841" Apr 16 10:05:24.978870 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:24.978834 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: 
\"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:05:24.979275 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:24.978924 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-qjjn5\" (UID: \"b74f7a44-b523-4723-8224-aa8bc5a5759c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5" Apr 16 10:05:24.979275 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:24.978988 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 10:05:24.979275 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:24.979006 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 10:05:24.979275 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:24.979068 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert podName:b74f7a44-b523-4723-8224-aa8bc5a5759c nodeName:}" failed. No retries permitted until 2026-04-16 10:05:40.979054815 +0000 UTC m=+65.759463980 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-qjjn5" (UID: "b74f7a44-b523-4723-8224-aa8bc5a5759c") : secret "networking-console-plugin-cert" not found Apr 16 10:05:24.979275 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:24.979008 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69c9cc7857-lwqxr: secret "image-registry-tls" not found Apr 16 10:05:24.979275 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:24.979143 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls podName:4b799a1d-5b70-47ff-b95d-e1cc88088ee8 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:40.979128977 +0000 UTC m=+65.759538144 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls") pod "image-registry-69c9cc7857-lwqxr" (UID: "4b799a1d-5b70-47ff-b95d-e1cc88088ee8") : secret "image-registry-tls" not found Apr 16 10:05:25.079366 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:25.079333 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls\") pod \"dns-default-4hw98\" (UID: \"f8ed2d4a-7837-47c0-aa78-9cb656214b23\") " pod="openshift-dns/dns-default-4hw98" Apr 16 10:05:25.079593 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:25.079431 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert\") pod \"ingress-canary-dwt6x\" (UID: \"9007958c-4532-4e2d-a3b7-94207d6d562a\") " 
pod="openshift-ingress-canary/ingress-canary-dwt6x" Apr 16 10:05:25.079593 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:25.079483 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 10:05:25.079593 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:25.079560 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 10:05:25.079593 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:25.079562 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls podName:f8ed2d4a-7837-47c0-aa78-9cb656214b23 nodeName:}" failed. No retries permitted until 2026-04-16 10:05:41.079544239 +0000 UTC m=+65.859953408 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls") pod "dns-default-4hw98" (UID: "f8ed2d4a-7837-47c0-aa78-9cb656214b23") : secret "dns-default-metrics-tls" not found Apr 16 10:05:25.079734 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:25.079609 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert podName:9007958c-4532-4e2d-a3b7-94207d6d562a nodeName:}" failed. No retries permitted until 2026-04-16 10:05:41.079597494 +0000 UTC m=+65.860006659 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert") pod "ingress-canary-dwt6x" (UID: "9007958c-4532-4e2d-a3b7-94207d6d562a") : secret "canary-serving-cert" not found Apr 16 10:05:34.983155 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:34.983124 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmhhw" Apr 16 10:05:41.008791 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:41.008748 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-qjjn5\" (UID: \"b74f7a44-b523-4723-8224-aa8bc5a5759c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5" Apr 16 10:05:41.009240 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:41.008839 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:05:41.009240 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:41.008918 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 10:05:41.009240 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:41.008991 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 10:05:41.009240 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:41.009011 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-69c9cc7857-lwqxr: secret "image-registry-tls" not found Apr 16 10:05:41.009240 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:41.008999 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert podName:b74f7a44-b523-4723-8224-aa8bc5a5759c nodeName:}" failed. No retries permitted until 2026-04-16 10:06:13.008981689 +0000 UTC m=+97.789390861 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-qjjn5" (UID: "b74f7a44-b523-4723-8224-aa8bc5a5759c") : secret "networking-console-plugin-cert" not found Apr 16 10:05:41.009240 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:41.009072 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls podName:4b799a1d-5b70-47ff-b95d-e1cc88088ee8 nodeName:}" failed. No retries permitted until 2026-04-16 10:06:13.009058056 +0000 UTC m=+97.789467221 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls") pod "image-registry-69c9cc7857-lwqxr" (UID: "4b799a1d-5b70-47ff-b95d-e1cc88088ee8") : secret "image-registry-tls" not found Apr 16 10:05:41.109925 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:41.109894 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert\") pod \"ingress-canary-dwt6x\" (UID: \"9007958c-4532-4e2d-a3b7-94207d6d562a\") " pod="openshift-ingress-canary/ingress-canary-dwt6x" Apr 16 10:05:41.110080 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:41.109941 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls\") pod \"dns-default-4hw98\" (UID: \"f8ed2d4a-7837-47c0-aa78-9cb656214b23\") " pod="openshift-dns/dns-default-4hw98" Apr 16 10:05:41.110080 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:41.110027 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 10:05:41.110150 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:41.110086 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert podName:9007958c-4532-4e2d-a3b7-94207d6d562a nodeName:}" failed. No retries permitted until 2026-04-16 10:06:13.110072696 +0000 UTC m=+97.890481860 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert") pod "ingress-canary-dwt6x" (UID: "9007958c-4532-4e2d-a3b7-94207d6d562a") : secret "canary-serving-cert" not found Apr 16 10:05:41.110150 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:41.110028 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 10:05:41.110218 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:41.110174 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls podName:f8ed2d4a-7837-47c0-aa78-9cb656214b23 nodeName:}" failed. No retries permitted until 2026-04-16 10:06:13.110163279 +0000 UTC m=+97.890572443 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls") pod "dns-default-4hw98" (UID: "f8ed2d4a-7837-47c0-aa78-9cb656214b23") : secret "dns-default-metrics-tls" not found Apr 16 10:05:41.512901 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:41.512861 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs\") pod \"network-metrics-daemon-9mqb2\" (UID: \"bccbed26-7fad-44dd-b120-4fe2758154e5\") " pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:05:41.515134 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:41.515118 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 10:05:41.524001 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:41.523982 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 10:05:41.524074 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:05:41.524043 
2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs podName:bccbed26-7fad-44dd-b120-4fe2758154e5 nodeName:}" failed. No retries permitted until 2026-04-16 10:06:45.524023328 +0000 UTC m=+130.304432493 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs") pod "network-metrics-daemon-9mqb2" (UID: "bccbed26-7fad-44dd-b120-4fe2758154e5") : secret "metrics-daemon-secret" not found Apr 16 10:05:41.614185 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:41.614151 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhjtp\" (UniqueName: \"kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp\") pod \"network-check-target-2h7wk\" (UID: \"dba5a2c9-dd08-4be6-a76d-85742ada944e\") " pod="openshift-network-diagnostics/network-check-target-2h7wk" Apr 16 10:05:41.616665 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:41.616645 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 10:05:41.627017 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:41.626999 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 10:05:41.639036 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:41.639007 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhjtp\" (UniqueName: \"kubernetes.io/projected/dba5a2c9-dd08-4be6-a76d-85742ada944e-kube-api-access-mhjtp\") pod \"network-check-target-2h7wk\" (UID: \"dba5a2c9-dd08-4be6-a76d-85742ada944e\") " pod="openshift-network-diagnostics/network-check-target-2h7wk" Apr 16 10:05:41.707317 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:41.707285 2570 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-mvsr8\"" Apr 16 10:05:41.715286 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:41.715261 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2h7wk" Apr 16 10:05:41.840696 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:41.840661 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2h7wk"] Apr 16 10:05:41.843658 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:05:41.843622 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddba5a2c9_dd08_4be6_a76d_85742ada944e.slice/crio-13be8800139141fd3e219e71ce5aad811089e90ea0b397a35c63d568673cf7bd WatchSource:0}: Error finding container 13be8800139141fd3e219e71ce5aad811089e90ea0b397a35c63d568673cf7bd: Status 404 returned error can't find the container with id 13be8800139141fd3e219e71ce5aad811089e90ea0b397a35c63d568673cf7bd Apr 16 10:05:42.017816 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:42.017784 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret\") pod \"global-pull-secret-syncer-znlvq\" (UID: \"e9fc47e9-a934-4e31-b64d-70aef48ffd3e\") " pod="kube-system/global-pull-secret-syncer-znlvq" Apr 16 10:05:42.020167 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:42.020146 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 10:05:42.030538 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:42.030476 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/e9fc47e9-a934-4e31-b64d-70aef48ffd3e-original-pull-secret\") pod \"global-pull-secret-syncer-znlvq\" (UID: \"e9fc47e9-a934-4e31-b64d-70aef48ffd3e\") " pod="kube-system/global-pull-secret-syncer-znlvq" Apr 16 10:05:42.077891 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:42.077859 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2h7wk" event={"ID":"dba5a2c9-dd08-4be6-a76d-85742ada944e","Type":"ContainerStarted","Data":"13be8800139141fd3e219e71ce5aad811089e90ea0b397a35c63d568673cf7bd"} Apr 16 10:05:42.312249 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:42.312168 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-znlvq" Apr 16 10:05:42.442394 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:42.442364 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-znlvq"] Apr 16 10:05:42.445819 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:05:42.445784 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9fc47e9_a934_4e31_b64d_70aef48ffd3e.slice/crio-681296a75ca86b500a6f3ba045ea0cc446ef24f72da5e3d89af65a57f5720dd6 WatchSource:0}: Error finding container 681296a75ca86b500a6f3ba045ea0cc446ef24f72da5e3d89af65a57f5720dd6: Status 404 returned error can't find the container with id 681296a75ca86b500a6f3ba045ea0cc446ef24f72da5e3d89af65a57f5720dd6 Apr 16 10:05:43.081470 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:43.081418 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-znlvq" event={"ID":"e9fc47e9-a934-4e31-b64d-70aef48ffd3e","Type":"ContainerStarted","Data":"681296a75ca86b500a6f3ba045ea0cc446ef24f72da5e3d89af65a57f5720dd6"} Apr 16 10:05:47.091242 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:47.091201 2570 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kube-system/global-pull-secret-syncer-znlvq" event={"ID":"e9fc47e9-a934-4e31-b64d-70aef48ffd3e","Type":"ContainerStarted","Data":"23010ed4b08f6bcebc044666490bf75a9159fe3cf4b9f8e7d77dba5869ebd43d"} Apr 16 10:05:47.092489 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:47.092464 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2h7wk" event={"ID":"dba5a2c9-dd08-4be6-a76d-85742ada944e","Type":"ContainerStarted","Data":"8c88b0748ecd1bbafe5cdd710e269eabf379ca15da265ec9bdcf48ac2f9218c8"} Apr 16 10:05:47.092637 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:47.092572 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-2h7wk" Apr 16 10:05:47.106812 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:47.106773 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-znlvq" podStartSLOduration=64.959966256 podStartE2EDuration="1m9.106760499s" podCreationTimestamp="2026-04-16 10:04:38 +0000 UTC" firstStartedPulling="2026-04-16 10:05:42.447699125 +0000 UTC m=+67.228108290" lastFinishedPulling="2026-04-16 10:05:46.594493363 +0000 UTC m=+71.374902533" observedRunningTime="2026-04-16 10:05:47.105920937 +0000 UTC m=+71.886330130" watchObservedRunningTime="2026-04-16 10:05:47.106760499 +0000 UTC m=+71.887169664" Apr 16 10:05:47.145980 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:05:47.145936 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-2h7wk" podStartSLOduration=67.407988897 podStartE2EDuration="1m12.145923221s" podCreationTimestamp="2026-04-16 10:04:35 +0000 UTC" firstStartedPulling="2026-04-16 10:05:41.845626334 +0000 UTC m=+66.626035499" lastFinishedPulling="2026-04-16 10:05:46.583560652 +0000 UTC m=+71.363969823" observedRunningTime="2026-04-16 10:05:47.141709641 +0000 UTC 
m=+71.922118839" watchObservedRunningTime="2026-04-16 10:05:47.145923221 +0000 UTC m=+71.926332408" Apr 16 10:06:13.058307 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:06:13.058271 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-qjjn5\" (UID: \"b74f7a44-b523-4723-8224-aa8bc5a5759c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5" Apr 16 10:06:13.058706 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:06:13.058341 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:06:13.058706 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:06:13.058425 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 10:06:13.058706 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:06:13.058445 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 10:06:13.058706 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:06:13.058455 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69c9cc7857-lwqxr: secret "image-registry-tls" not found Apr 16 10:06:13.058706 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:06:13.058493 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert podName:b74f7a44-b523-4723-8224-aa8bc5a5759c nodeName:}" 
failed. No retries permitted until 2026-04-16 10:07:17.058477569 +0000 UTC m=+161.838886734 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-qjjn5" (UID: "b74f7a44-b523-4723-8224-aa8bc5a5759c") : secret "networking-console-plugin-cert" not found Apr 16 10:06:13.058706 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:06:13.058508 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls podName:4b799a1d-5b70-47ff-b95d-e1cc88088ee8 nodeName:}" failed. No retries permitted until 2026-04-16 10:07:17.058501763 +0000 UTC m=+161.838910928 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls") pod "image-registry-69c9cc7857-lwqxr" (UID: "4b799a1d-5b70-47ff-b95d-e1cc88088ee8") : secret "image-registry-tls" not found Apr 16 10:06:13.159693 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:06:13.159658 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert\") pod \"ingress-canary-dwt6x\" (UID: \"9007958c-4532-4e2d-a3b7-94207d6d562a\") " pod="openshift-ingress-canary/ingress-canary-dwt6x" Apr 16 10:06:13.159864 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:06:13.159718 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls\") pod \"dns-default-4hw98\" (UID: \"f8ed2d4a-7837-47c0-aa78-9cb656214b23\") " pod="openshift-dns/dns-default-4hw98" Apr 16 10:06:13.159864 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:06:13.159793 2570 secret.go:189] Couldn't get 
secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 10:06:13.159864 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:06:13.159853 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert podName:9007958c-4532-4e2d-a3b7-94207d6d562a nodeName:}" failed. No retries permitted until 2026-04-16 10:07:17.159838769 +0000 UTC m=+161.940247935 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert") pod "ingress-canary-dwt6x" (UID: "9007958c-4532-4e2d-a3b7-94207d6d562a") : secret "canary-serving-cert" not found Apr 16 10:06:13.159864 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:06:13.159855 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 10:06:13.160014 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:06:13.159896 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls podName:f8ed2d4a-7837-47c0-aa78-9cb656214b23 nodeName:}" failed. No retries permitted until 2026-04-16 10:07:17.15988528 +0000 UTC m=+161.940294449 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls") pod "dns-default-4hw98" (UID: "f8ed2d4a-7837-47c0-aa78-9cb656214b23") : secret "dns-default-metrics-tls" not found Apr 16 10:06:18.098332 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:06:18.098304 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-2h7wk" Apr 16 10:06:45.604843 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:06:45.604796 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs\") pod \"network-metrics-daemon-9mqb2\" (UID: \"bccbed26-7fad-44dd-b120-4fe2758154e5\") " pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:06:45.605339 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:06:45.604955 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 10:06:45.605339 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:06:45.605024 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs podName:bccbed26-7fad-44dd-b120-4fe2758154e5 nodeName:}" failed. No retries permitted until 2026-04-16 10:08:47.605009767 +0000 UTC m=+252.385418938 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs") pod "network-metrics-daemon-9mqb2" (UID: "bccbed26-7fad-44dd-b120-4fe2758154e5") : secret "metrics-daemon-secret" not found Apr 16 10:06:53.563723 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:06:53.563697 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rj4q8_fc1a6ffe-e50d-41c1-aa12-db0a4d10232f/dns-node-resolver/0.log" Apr 16 10:06:54.760327 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:06:54.760299 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-v8sh7_4a61fb09-0793-4b69-b34c-784caf0249e5/node-ca/0.log" Apr 16 10:07:12.085472 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:07:12.085408 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5" podUID="b74f7a44-b523-4723-8224-aa8bc5a5759c" Apr 16 10:07:12.112779 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:07:12.112756 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" podUID="4b799a1d-5b70-47ff-b95d-e1cc88088ee8" Apr 16 10:07:12.191226 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:07:12.191188 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-dwt6x" podUID="9007958c-4532-4e2d-a3b7-94207d6d562a" Apr 16 10:07:12.205336 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:07:12.205308 2570 pod_workers.go:1301] "Error syncing pod, skipping" 
err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-4hw98" podUID="f8ed2d4a-7837-47c0-aa78-9cb656214b23" Apr 16 10:07:12.292298 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:12.292268 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:07:12.292430 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:12.292317 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dwt6x" Apr 16 10:07:12.292430 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:12.292388 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5" Apr 16 10:07:12.292574 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:12.292555 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4hw98" Apr 16 10:07:13.816273 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:07:13.816223 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-9mqb2" podUID="bccbed26-7fad-44dd-b120-4fe2758154e5" Apr 16 10:07:13.938081 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:13.938046 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xf8wh"] Apr 16 10:07:13.941075 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:13.941053 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xf8wh" Apr 16 10:07:13.943195 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:13.943171 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 10:07:13.943307 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:13.943249 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vnpkd\"" Apr 16 10:07:13.943968 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:13.943952 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 10:07:13.943968 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:13.943966 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 10:07:13.944164 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:13.944145 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 10:07:13.954158 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:13.954136 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xf8wh"] Apr 16 10:07:14.020693 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:14.020659 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/120ed0a9-f7ea-4130-a3dd-d983418dbacb-data-volume\") pod \"insights-runtime-extractor-xf8wh\" (UID: \"120ed0a9-f7ea-4130-a3dd-d983418dbacb\") " pod="openshift-insights/insights-runtime-extractor-xf8wh" Apr 16 10:07:14.020854 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:14.020702 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-n8wpd\" (UniqueName: \"kubernetes.io/projected/120ed0a9-f7ea-4130-a3dd-d983418dbacb-kube-api-access-n8wpd\") pod \"insights-runtime-extractor-xf8wh\" (UID: \"120ed0a9-f7ea-4130-a3dd-d983418dbacb\") " pod="openshift-insights/insights-runtime-extractor-xf8wh" Apr 16 10:07:14.020854 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:14.020722 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/120ed0a9-f7ea-4130-a3dd-d983418dbacb-crio-socket\") pod \"insights-runtime-extractor-xf8wh\" (UID: \"120ed0a9-f7ea-4130-a3dd-d983418dbacb\") " pod="openshift-insights/insights-runtime-extractor-xf8wh" Apr 16 10:07:14.020854 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:14.020745 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/120ed0a9-f7ea-4130-a3dd-d983418dbacb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xf8wh\" (UID: \"120ed0a9-f7ea-4130-a3dd-d983418dbacb\") " pod="openshift-insights/insights-runtime-extractor-xf8wh" Apr 16 10:07:14.020854 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:14.020776 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/120ed0a9-f7ea-4130-a3dd-d983418dbacb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xf8wh\" (UID: \"120ed0a9-f7ea-4130-a3dd-d983418dbacb\") " pod="openshift-insights/insights-runtime-extractor-xf8wh" Apr 16 10:07:14.121555 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:14.121457 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8wpd\" (UniqueName: \"kubernetes.io/projected/120ed0a9-f7ea-4130-a3dd-d983418dbacb-kube-api-access-n8wpd\") pod \"insights-runtime-extractor-xf8wh\" (UID: 
\"120ed0a9-f7ea-4130-a3dd-d983418dbacb\") " pod="openshift-insights/insights-runtime-extractor-xf8wh" Apr 16 10:07:14.121555 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:14.121496 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/120ed0a9-f7ea-4130-a3dd-d983418dbacb-crio-socket\") pod \"insights-runtime-extractor-xf8wh\" (UID: \"120ed0a9-f7ea-4130-a3dd-d983418dbacb\") " pod="openshift-insights/insights-runtime-extractor-xf8wh" Apr 16 10:07:14.121555 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:14.121515 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/120ed0a9-f7ea-4130-a3dd-d983418dbacb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xf8wh\" (UID: \"120ed0a9-f7ea-4130-a3dd-d983418dbacb\") " pod="openshift-insights/insights-runtime-extractor-xf8wh" Apr 16 10:07:14.121770 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:14.121561 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/120ed0a9-f7ea-4130-a3dd-d983418dbacb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xf8wh\" (UID: \"120ed0a9-f7ea-4130-a3dd-d983418dbacb\") " pod="openshift-insights/insights-runtime-extractor-xf8wh" Apr 16 10:07:14.121770 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:14.121614 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/120ed0a9-f7ea-4130-a3dd-d983418dbacb-crio-socket\") pod \"insights-runtime-extractor-xf8wh\" (UID: \"120ed0a9-f7ea-4130-a3dd-d983418dbacb\") " pod="openshift-insights/insights-runtime-extractor-xf8wh" Apr 16 10:07:14.121770 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:14.121668 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" 
(UniqueName: \"kubernetes.io/empty-dir/120ed0a9-f7ea-4130-a3dd-d983418dbacb-data-volume\") pod \"insights-runtime-extractor-xf8wh\" (UID: \"120ed0a9-f7ea-4130-a3dd-d983418dbacb\") " pod="openshift-insights/insights-runtime-extractor-xf8wh" Apr 16 10:07:14.122015 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:14.122000 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/120ed0a9-f7ea-4130-a3dd-d983418dbacb-data-volume\") pod \"insights-runtime-extractor-xf8wh\" (UID: \"120ed0a9-f7ea-4130-a3dd-d983418dbacb\") " pod="openshift-insights/insights-runtime-extractor-xf8wh" Apr 16 10:07:14.122132 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:14.122115 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/120ed0a9-f7ea-4130-a3dd-d983418dbacb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xf8wh\" (UID: \"120ed0a9-f7ea-4130-a3dd-d983418dbacb\") " pod="openshift-insights/insights-runtime-extractor-xf8wh" Apr 16 10:07:14.123909 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:14.123880 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/120ed0a9-f7ea-4130-a3dd-d983418dbacb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xf8wh\" (UID: \"120ed0a9-f7ea-4130-a3dd-d983418dbacb\") " pod="openshift-insights/insights-runtime-extractor-xf8wh" Apr 16 10:07:14.130934 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:14.130909 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8wpd\" (UniqueName: \"kubernetes.io/projected/120ed0a9-f7ea-4130-a3dd-d983418dbacb-kube-api-access-n8wpd\") pod \"insights-runtime-extractor-xf8wh\" (UID: \"120ed0a9-f7ea-4130-a3dd-d983418dbacb\") " pod="openshift-insights/insights-runtime-extractor-xf8wh" Apr 16 10:07:14.251008 ip-10-0-143-196 
kubenswrapper[2570]: I0416 10:07:14.250951 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xf8wh" Apr 16 10:07:14.367311 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:14.367221 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xf8wh"] Apr 16 10:07:14.370142 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:07:14.370106 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod120ed0a9_f7ea_4130_a3dd_d983418dbacb.slice/crio-7f1585f4efa852c776733041619f4c2519887c847fd49498a12e9746ef5cbca5 WatchSource:0}: Error finding container 7f1585f4efa852c776733041619f4c2519887c847fd49498a12e9746ef5cbca5: Status 404 returned error can't find the container with id 7f1585f4efa852c776733041619f4c2519887c847fd49498a12e9746ef5cbca5 Apr 16 10:07:15.307088 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:15.307046 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xf8wh" event={"ID":"120ed0a9-f7ea-4130-a3dd-d983418dbacb","Type":"ContainerStarted","Data":"daa011ff0752a30609f8782e90560477e1cd4fbb04c35af9464d0ad14c1e172a"} Apr 16 10:07:15.307088 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:15.307087 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xf8wh" event={"ID":"120ed0a9-f7ea-4130-a3dd-d983418dbacb","Type":"ContainerStarted","Data":"76f75272a5a1c94d9498042273af927da2e98c854726c129198d3e8806e3710a"} Apr 16 10:07:15.307088 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:15.307097 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xf8wh" event={"ID":"120ed0a9-f7ea-4130-a3dd-d983418dbacb","Type":"ContainerStarted","Data":"7f1585f4efa852c776733041619f4c2519887c847fd49498a12e9746ef5cbca5"} Apr 16 10:07:17.145139 
ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.145096 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-qjjn5\" (UID: \"b74f7a44-b523-4723-8224-aa8bc5a5759c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5" Apr 16 10:07:17.145567 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.145170 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:07:17.147482 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.147457 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls\") pod \"image-registry-69c9cc7857-lwqxr\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:07:17.147582 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.147483 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b74f7a44-b523-4723-8224-aa8bc5a5759c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-qjjn5\" (UID: \"b74f7a44-b523-4723-8224-aa8bc5a5759c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5" Apr 16 10:07:17.245893 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.245852 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert\") pod \"ingress-canary-dwt6x\" (UID: \"9007958c-4532-4e2d-a3b7-94207d6d562a\") " pod="openshift-ingress-canary/ingress-canary-dwt6x" Apr 16 10:07:17.246042 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.245909 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls\") pod \"dns-default-4hw98\" (UID: \"f8ed2d4a-7837-47c0-aa78-9cb656214b23\") " pod="openshift-dns/dns-default-4hw98" Apr 16 10:07:17.248182 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.248158 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8ed2d4a-7837-47c0-aa78-9cb656214b23-metrics-tls\") pod \"dns-default-4hw98\" (UID: \"f8ed2d4a-7837-47c0-aa78-9cb656214b23\") " pod="openshift-dns/dns-default-4hw98" Apr 16 10:07:17.248353 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.248329 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9007958c-4532-4e2d-a3b7-94207d6d562a-cert\") pod \"ingress-canary-dwt6x\" (UID: \"9007958c-4532-4e2d-a3b7-94207d6d562a\") " pod="openshift-ingress-canary/ingress-canary-dwt6x" Apr 16 10:07:17.313163 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.313138 2570 generic.go:358] "Generic (PLEG): container finished" podID="e2919cf8-ac8b-4c58-be68-19e8c41e0c82" containerID="0d00e204b41cf6aeb5d2eba7a623ec2e6fbff976e7ce1ebc72feb2389e8296f5" exitCode=255 Apr 16 10:07:17.313272 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.313210 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76864b55b9-5wp8p" event={"ID":"e2919cf8-ac8b-4c58-be68-19e8c41e0c82","Type":"ContainerDied","Data":"0d00e204b41cf6aeb5d2eba7a623ec2e6fbff976e7ce1ebc72feb2389e8296f5"} Apr 16 
10:07:17.313563 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.313523 2570 scope.go:117] "RemoveContainer" containerID="0d00e204b41cf6aeb5d2eba7a623ec2e6fbff976e7ce1ebc72feb2389e8296f5" Apr 16 10:07:17.315054 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.315028 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xf8wh" event={"ID":"120ed0a9-f7ea-4130-a3dd-d983418dbacb","Type":"ContainerStarted","Data":"d4a84f51851b0d8f78d4c733afc487a952b9277a1138b8f00c48d5db92163859"} Apr 16 10:07:17.316344 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.316322 2570 generic.go:358] "Generic (PLEG): container finished" podID="ed278555-41a2-4c45-aac6-3d9454782e11" containerID="60fc1eed7938c17c63af34dea17e0e8354949fb99c32c58b2b5f6ca310e6765d" exitCode=1 Apr 16 10:07:17.316454 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.316367 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw" event={"ID":"ed278555-41a2-4c45-aac6-3d9454782e11","Type":"ContainerDied","Data":"60fc1eed7938c17c63af34dea17e0e8354949fb99c32c58b2b5f6ca310e6765d"} Apr 16 10:07:17.316703 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.316687 2570 scope.go:117] "RemoveContainer" containerID="60fc1eed7938c17c63af34dea17e0e8354949fb99c32c58b2b5f6ca310e6765d" Apr 16 10:07:17.348045 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.348002 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xf8wh" podStartSLOduration=2.102908719 podStartE2EDuration="4.347989628s" podCreationTimestamp="2026-04-16 10:07:13 +0000 UTC" firstStartedPulling="2026-04-16 10:07:14.419259234 +0000 UTC m=+159.199668399" lastFinishedPulling="2026-04-16 10:07:16.664340144 +0000 UTC m=+161.444749308" observedRunningTime="2026-04-16 10:07:17.347329043 +0000 UTC m=+162.127738254" watchObservedRunningTime="2026-04-16 
10:07:17.347989628 +0000 UTC m=+162.128398814" Apr 16 10:07:17.395461 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.395377 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-kpfjj\"" Apr 16 10:07:17.395604 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.395584 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ccsn6\"" Apr 16 10:07:17.395730 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.395711 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-652wv\"" Apr 16 10:07:17.395796 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.395778 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cwtl6\"" Apr 16 10:07:17.403644 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.403607 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:07:17.403644 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.403627 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4hw98" Apr 16 10:07:17.403807 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.403651 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5" Apr 16 10:07:17.403807 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.403799 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dwt6x" Apr 16 10:07:17.572106 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.571920 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4hw98"] Apr 16 10:07:17.578210 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:07:17.578164 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8ed2d4a_7837_47c0_aa78_9cb656214b23.slice/crio-4d33414496ac2b2e94bac4890a720861bba2f6088429659ffcbb0b647e6a2dc1 WatchSource:0}: Error finding container 4d33414496ac2b2e94bac4890a720861bba2f6088429659ffcbb0b647e6a2dc1: Status 404 returned error can't find the container with id 4d33414496ac2b2e94bac4890a720861bba2f6088429659ffcbb0b647e6a2dc1 Apr 16 10:07:17.589282 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.589256 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dwt6x"] Apr 16 10:07:17.592478 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:07:17.592449 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9007958c_4532_4e2d_a3b7_94207d6d562a.slice/crio-62eb26e5e238541cd6c0aa84fab522b9b51d42181abe7a44f7f977ae18c8286c WatchSource:0}: Error finding container 62eb26e5e238541cd6c0aa84fab522b9b51d42181abe7a44f7f977ae18c8286c: Status 404 returned error can't find the container with id 62eb26e5e238541cd6c0aa84fab522b9b51d42181abe7a44f7f977ae18c8286c Apr 16 10:07:17.813916 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.813888 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5"] Apr 16 10:07:17.817110 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:07:17.817084 2570 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb74f7a44_b523_4723_8224_aa8bc5a5759c.slice/crio-62ae6ad0908c4066b5dc4ae2c675beb1510822dcf9c02fd22a734eeb49a9326d WatchSource:0}: Error finding container 62ae6ad0908c4066b5dc4ae2c675beb1510822dcf9c02fd22a734eeb49a9326d: Status 404 returned error can't find the container with id 62ae6ad0908c4066b5dc4ae2c675beb1510822dcf9c02fd22a734eeb49a9326d Apr 16 10:07:17.820813 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:17.820787 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69c9cc7857-lwqxr"] Apr 16 10:07:17.823660 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:07:17.823636 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b799a1d_5b70_47ff_b95d_e1cc88088ee8.slice/crio-683747f1e0600d0c2427659c7a8fc57f21b765c90e9a359c800d91149d0e1626 WatchSource:0}: Error finding container 683747f1e0600d0c2427659c7a8fc57f21b765c90e9a359c800d91149d0e1626: Status 404 returned error can't find the container with id 683747f1e0600d0c2427659c7a8fc57f21b765c90e9a359c800d91149d0e1626 Apr 16 10:07:18.030166 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:18.030127 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw" Apr 16 10:07:18.321925 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:18.321840 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw" event={"ID":"ed278555-41a2-4c45-aac6-3d9454782e11","Type":"ContainerStarted","Data":"43f9a4faf28895a29d45e8a3efd200042e4b034de4d2ae26d5f896b2c01b6114"} Apr 16 10:07:18.322460 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:18.322435 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw" Apr 16 10:07:18.323225 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:18.323201 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d5df4c6cb-fm7kw" Apr 16 10:07:18.323749 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:18.323723 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4hw98" event={"ID":"f8ed2d4a-7837-47c0-aa78-9cb656214b23","Type":"ContainerStarted","Data":"4d33414496ac2b2e94bac4890a720861bba2f6088429659ffcbb0b647e6a2dc1"} Apr 16 10:07:18.325053 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:18.325019 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dwt6x" event={"ID":"9007958c-4532-4e2d-a3b7-94207d6d562a","Type":"ContainerStarted","Data":"62eb26e5e238541cd6c0aa84fab522b9b51d42181abe7a44f7f977ae18c8286c"} Apr 16 10:07:18.326539 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:18.326478 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" event={"ID":"4b799a1d-5b70-47ff-b95d-e1cc88088ee8","Type":"ContainerStarted","Data":"24db2605bdf85497bc415e49b6473244497b3cb283c7d132abd41959952547a1"} Apr 16 10:07:18.326539 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:18.326508 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" event={"ID":"4b799a1d-5b70-47ff-b95d-e1cc88088ee8","Type":"ContainerStarted","Data":"683747f1e0600d0c2427659c7a8fc57f21b765c90e9a359c800d91149d0e1626"} Apr 16 10:07:18.326687 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:18.326647 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:07:18.327841 ip-10-0-143-196 kubenswrapper[2570]: I0416 
10:07:18.327816 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5" event={"ID":"b74f7a44-b523-4723-8224-aa8bc5a5759c","Type":"ContainerStarted","Data":"62ae6ad0908c4066b5dc4ae2c675beb1510822dcf9c02fd22a734eeb49a9326d"} Apr 16 10:07:18.329895 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:18.329835 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76864b55b9-5wp8p" event={"ID":"e2919cf8-ac8b-4c58-be68-19e8c41e0c82","Type":"ContainerStarted","Data":"d00cb8b58bf5cca3a928a23936c0eb7f5ebde7ed650e79a6389327be4809a054"} Apr 16 10:07:18.356172 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:18.356121 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" podStartSLOduration=162.35610956 podStartE2EDuration="2m42.35610956s" podCreationTimestamp="2026-04-16 10:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 10:07:18.355713884 +0000 UTC m=+163.136123094" watchObservedRunningTime="2026-04-16 10:07:18.35610956 +0000 UTC m=+163.136518747" Apr 16 10:07:20.335898 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:20.335866 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4hw98" event={"ID":"f8ed2d4a-7837-47c0-aa78-9cb656214b23","Type":"ContainerStarted","Data":"2c8fd595106463fd23124a70fca9d5300aae57e69d4ee787841d777813b2fba4"} Apr 16 10:07:20.335898 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:20.335901 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4hw98" event={"ID":"f8ed2d4a-7837-47c0-aa78-9cb656214b23","Type":"ContainerStarted","Data":"04adb25fcff963633ab53a949a1989de4a5b2428e378b472fa1742a129cd3587"} Apr 16 10:07:20.336359 ip-10-0-143-196 
kubenswrapper[2570]: I0416 10:07:20.335982 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-4hw98" Apr 16 10:07:20.337165 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:20.337146 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dwt6x" event={"ID":"9007958c-4532-4e2d-a3b7-94207d6d562a","Type":"ContainerStarted","Data":"6a200458e4c5e2c6fe151e25871b942a1cbb21d7b729d731208c39f1e08f6c1e"} Apr 16 10:07:20.338329 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:20.338308 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5" event={"ID":"b74f7a44-b523-4723-8224-aa8bc5a5759c","Type":"ContainerStarted","Data":"da8b8f410265018485ac4b2887b656bd8d83e24a6049aa602f59d23b97d93dc0"} Apr 16 10:07:20.355699 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:20.355657 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4hw98" podStartSLOduration=129.232489442 podStartE2EDuration="2m11.355645843s" podCreationTimestamp="2026-04-16 10:05:09 +0000 UTC" firstStartedPulling="2026-04-16 10:07:17.580959659 +0000 UTC m=+162.361368824" lastFinishedPulling="2026-04-16 10:07:19.704116056 +0000 UTC m=+164.484525225" observedRunningTime="2026-04-16 10:07:20.354961209 +0000 UTC m=+165.135370395" watchObservedRunningTime="2026-04-16 10:07:20.355645843 +0000 UTC m=+165.136055020" Apr 16 10:07:20.368771 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:20.368736 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dwt6x" podStartSLOduration=129.252878149 podStartE2EDuration="2m11.368727491s" podCreationTimestamp="2026-04-16 10:05:09 +0000 UTC" firstStartedPulling="2026-04-16 10:07:17.594591024 +0000 UTC m=+162.375000209" lastFinishedPulling="2026-04-16 10:07:19.710440359 +0000 UTC m=+164.490849551" 
observedRunningTime="2026-04-16 10:07:20.368628058 +0000 UTC m=+165.149037246" watchObservedRunningTime="2026-04-16 10:07:20.368727491 +0000 UTC m=+165.149136677" Apr 16 10:07:20.385887 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:20.385842 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-qjjn5" podStartSLOduration=163.500961967 podStartE2EDuration="2m45.385829365s" podCreationTimestamp="2026-04-16 10:04:35 +0000 UTC" firstStartedPulling="2026-04-16 10:07:17.820080424 +0000 UTC m=+162.600489593" lastFinishedPulling="2026-04-16 10:07:19.704947825 +0000 UTC m=+164.485356991" observedRunningTime="2026-04-16 10:07:20.385642196 +0000 UTC m=+165.166051383" watchObservedRunningTime="2026-04-16 10:07:20.385829365 +0000 UTC m=+165.166238553" Apr 16 10:07:28.794695 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:28.794605 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:07:30.343078 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.343048 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4hw98" Apr 16 10:07:30.491106 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.491077 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-wn2sw"] Apr 16 10:07:30.494697 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.494675 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.497760 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.497736 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 10:07:30.497866 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.497825 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 10:07:30.498304 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.498273 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 10:07:30.498411 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.498328 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 10:07:30.498411 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.498278 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 10:07:30.498411 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.498358 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 10:07:30.498595 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.498413 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7m7d8\"" Apr 16 10:07:30.653076 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.652998 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8d688b17-07fb-4393-86fa-0a470c907047-node-exporter-textfile\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " 
pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.653076 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.653044 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d688b17-07fb-4393-86fa-0a470c907047-metrics-client-ca\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.653076 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.653063 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhvqw\" (UniqueName: \"kubernetes.io/projected/8d688b17-07fb-4393-86fa-0a470c907047-kube-api-access-dhvqw\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.653267 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.653083 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8d688b17-07fb-4393-86fa-0a470c907047-node-exporter-tls\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.653267 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.653117 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8d688b17-07fb-4393-86fa-0a470c907047-root\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.653267 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.653141 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/8d688b17-07fb-4393-86fa-0a470c907047-sys\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.653267 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.653176 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8d688b17-07fb-4393-86fa-0a470c907047-node-exporter-wtmp\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.653267 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.653193 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8d688b17-07fb-4393-86fa-0a470c907047-node-exporter-accelerators-collector-config\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.653267 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.653238 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8d688b17-07fb-4393-86fa-0a470c907047-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.754484 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.754453 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8d688b17-07fb-4393-86fa-0a470c907047-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " 
pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.754650 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.754488 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8d688b17-07fb-4393-86fa-0a470c907047-node-exporter-textfile\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.754650 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.754519 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d688b17-07fb-4393-86fa-0a470c907047-metrics-client-ca\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.754650 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.754562 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhvqw\" (UniqueName: \"kubernetes.io/projected/8d688b17-07fb-4393-86fa-0a470c907047-kube-api-access-dhvqw\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.754650 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.754588 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8d688b17-07fb-4393-86fa-0a470c907047-node-exporter-tls\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.754650 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.754614 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8d688b17-07fb-4393-86fa-0a470c907047-root\") pod \"node-exporter-wn2sw\" (UID: 
\"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.754650 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.754649 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d688b17-07fb-4393-86fa-0a470c907047-sys\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.754909 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.754681 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8d688b17-07fb-4393-86fa-0a470c907047-node-exporter-wtmp\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.754909 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.754706 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8d688b17-07fb-4393-86fa-0a470c907047-root\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.754909 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.754731 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d688b17-07fb-4393-86fa-0a470c907047-sys\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.754909 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.754709 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8d688b17-07fb-4393-86fa-0a470c907047-node-exporter-accelerators-collector-config\") pod \"node-exporter-wn2sw\" (UID: 
\"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.754909 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.754831 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8d688b17-07fb-4393-86fa-0a470c907047-node-exporter-wtmp\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.754909 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.754876 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8d688b17-07fb-4393-86fa-0a470c907047-node-exporter-textfile\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.755285 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.755254 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d688b17-07fb-4393-86fa-0a470c907047-metrics-client-ca\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.755285 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.755268 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8d688b17-07fb-4393-86fa-0a470c907047-node-exporter-accelerators-collector-config\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.756927 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.756910 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/8d688b17-07fb-4393-86fa-0a470c907047-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.757011 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.756999 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8d688b17-07fb-4393-86fa-0a470c907047-node-exporter-tls\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.764947 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.764927 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhvqw\" (UniqueName: \"kubernetes.io/projected/8d688b17-07fb-4393-86fa-0a470c907047-kube-api-access-dhvqw\") pod \"node-exporter-wn2sw\" (UID: \"8d688b17-07fb-4393-86fa-0a470c907047\") " pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.803707 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:30.803686 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-wn2sw" Apr 16 10:07:30.814105 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:07:30.814082 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d688b17_07fb_4393_86fa_0a470c907047.slice/crio-b2d36bd96ac1ac0046a5df405b474e121a178edbf4274b0493057687295b4c53 WatchSource:0}: Error finding container b2d36bd96ac1ac0046a5df405b474e121a178edbf4274b0493057687295b4c53: Status 404 returned error can't find the container with id b2d36bd96ac1ac0046a5df405b474e121a178edbf4274b0493057687295b4c53 Apr 16 10:07:31.366310 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:31.366264 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wn2sw" event={"ID":"8d688b17-07fb-4393-86fa-0a470c907047","Type":"ContainerStarted","Data":"b2d36bd96ac1ac0046a5df405b474e121a178edbf4274b0493057687295b4c53"} Apr 16 10:07:32.370038 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:32.369998 2570 generic.go:358] "Generic (PLEG): container finished" podID="8d688b17-07fb-4393-86fa-0a470c907047" containerID="7e6f492f85988a3c5af0a757589c1ddc6641e5e03ec4d419a973d636ed4b3271" exitCode=0 Apr 16 10:07:32.370407 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:32.370076 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wn2sw" event={"ID":"8d688b17-07fb-4393-86fa-0a470c907047","Type":"ContainerDied","Data":"7e6f492f85988a3c5af0a757589c1ddc6641e5e03ec4d419a973d636ed4b3271"} Apr 16 10:07:33.374151 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:33.374116 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wn2sw" event={"ID":"8d688b17-07fb-4393-86fa-0a470c907047","Type":"ContainerStarted","Data":"1c15f90833f48963635b9cf63735a7164fe338beaa6f11dfd5166251910defec"} Apr 16 10:07:33.374151 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:33.374151 2570 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wn2sw" event={"ID":"8d688b17-07fb-4393-86fa-0a470c907047","Type":"ContainerStarted","Data":"39bd55d27ab53b28ef6ee345d10bda5649e62bf362be23e467049da3b2b57c3d"} Apr 16 10:07:33.395620 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:33.395574 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-wn2sw" podStartSLOduration=2.62851604 podStartE2EDuration="3.395559489s" podCreationTimestamp="2026-04-16 10:07:30 +0000 UTC" firstStartedPulling="2026-04-16 10:07:30.815705508 +0000 UTC m=+175.596114673" lastFinishedPulling="2026-04-16 10:07:31.582748957 +0000 UTC m=+176.363158122" observedRunningTime="2026-04-16 10:07:33.394348888 +0000 UTC m=+178.174758099" watchObservedRunningTime="2026-04-16 10:07:33.395559489 +0000 UTC m=+178.175968711" Apr 16 10:07:36.132028 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:36.131994 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69c9cc7857-lwqxr"] Apr 16 10:07:36.136221 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:36.136194 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:07:39.473771 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:39.473714 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" podUID="498de4dd-364c-48a3-a4a8-9b4d0bce5ce1" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 10:07:49.472802 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:49.472764 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" podUID="498de4dd-364c-48a3-a4a8-9b4d0bce5ce1" containerName="service-proxy" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 10:07:59.472862 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:59.472818 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" podUID="498de4dd-364c-48a3-a4a8-9b4d0bce5ce1" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 10:07:59.473315 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:59.472898 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" Apr 16 10:07:59.473511 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:59.473476 2570 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"f6e804b33608cb110a6d9454a7489c10b5242793bbe7f7259fd3ada5b707dfe1"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 10:07:59.473614 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:07:59.473557 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" podUID="498de4dd-364c-48a3-a4a8-9b4d0bce5ce1" containerName="service-proxy" containerID="cri-o://f6e804b33608cb110a6d9454a7489c10b5242793bbe7f7259fd3ada5b707dfe1" gracePeriod=30 Apr 16 10:08:00.443503 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:00.443471 2570 generic.go:358] "Generic (PLEG): container finished" podID="498de4dd-364c-48a3-a4a8-9b4d0bce5ce1" containerID="f6e804b33608cb110a6d9454a7489c10b5242793bbe7f7259fd3ada5b707dfe1" exitCode=2 Apr 16 10:08:00.443697 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:00.443542 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" event={"ID":"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1","Type":"ContainerDied","Data":"f6e804b33608cb110a6d9454a7489c10b5242793bbe7f7259fd3ada5b707dfe1"} Apr 16 10:08:00.443697 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:00.443571 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686b9879fb-cx6k5" event={"ID":"498de4dd-364c-48a3-a4a8-9b4d0bce5ce1","Type":"ContainerStarted","Data":"978facec3af4a372d3facf6fe396babe897158ff864797e78645eaad1047f394"} Apr 16 10:08:01.150889 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.150852 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" podUID="4b799a1d-5b70-47ff-b95d-e1cc88088ee8" containerName="registry" containerID="cri-o://24db2605bdf85497bc415e49b6473244497b3cb283c7d132abd41959952547a1" gracePeriod=30 Apr 16 10:08:01.385122 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.385101 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:08:01.447583 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.447500 2570 generic.go:358] "Generic (PLEG): container finished" podID="4b799a1d-5b70-47ff-b95d-e1cc88088ee8" containerID="24db2605bdf85497bc415e49b6473244497b3cb283c7d132abd41959952547a1" exitCode=0 Apr 16 10:08:01.447583 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.447568 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" Apr 16 10:08:01.447729 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.447588 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" event={"ID":"4b799a1d-5b70-47ff-b95d-e1cc88088ee8","Type":"ContainerDied","Data":"24db2605bdf85497bc415e49b6473244497b3cb283c7d132abd41959952547a1"} Apr 16 10:08:01.447729 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.447630 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69c9cc7857-lwqxr" event={"ID":"4b799a1d-5b70-47ff-b95d-e1cc88088ee8","Type":"ContainerDied","Data":"683747f1e0600d0c2427659c7a8fc57f21b765c90e9a359c800d91149d0e1626"} Apr 16 10:08:01.447729 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.447645 2570 scope.go:117] "RemoveContainer" containerID="24db2605bdf85497bc415e49b6473244497b3cb283c7d132abd41959952547a1" Apr 16 10:08:01.455468 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.455453 2570 scope.go:117] "RemoveContainer" containerID="24db2605bdf85497bc415e49b6473244497b3cb283c7d132abd41959952547a1" Apr 16 10:08:01.455725 ip-10-0-143-196 kubenswrapper[2570]: E0416 10:08:01.455707 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24db2605bdf85497bc415e49b6473244497b3cb283c7d132abd41959952547a1\": container with ID starting with 24db2605bdf85497bc415e49b6473244497b3cb283c7d132abd41959952547a1 not found: ID does not exist" containerID="24db2605bdf85497bc415e49b6473244497b3cb283c7d132abd41959952547a1" Apr 16 10:08:01.455790 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.455737 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24db2605bdf85497bc415e49b6473244497b3cb283c7d132abd41959952547a1"} err="failed to get container status 
\"24db2605bdf85497bc415e49b6473244497b3cb283c7d132abd41959952547a1\": rpc error: code = NotFound desc = could not find container \"24db2605bdf85497bc415e49b6473244497b3cb283c7d132abd41959952547a1\": container with ID starting with 24db2605bdf85497bc415e49b6473244497b3cb283c7d132abd41959952547a1 not found: ID does not exist" Apr 16 10:08:01.483004 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.482983 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-image-registry-private-configuration\") pod \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " Apr 16 10:08:01.483068 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.483017 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh4wc\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-kube-api-access-fh4wc\") pod \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " Apr 16 10:08:01.483068 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.483044 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-trusted-ca\") pod \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " Apr 16 10:08:01.483141 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.483075 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls\") pod \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " Apr 16 10:08:01.483141 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.483097 2570 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-ca-trust-extracted\") pod \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " Apr 16 10:08:01.483141 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.483117 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-bound-sa-token\") pod \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " Apr 16 10:08:01.483291 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.483146 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-installation-pull-secrets\") pod \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " Apr 16 10:08:01.483291 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.483176 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-certificates\") pod \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\" (UID: \"4b799a1d-5b70-47ff-b95d-e1cc88088ee8\") " Apr 16 10:08:01.485154 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.483580 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4b799a1d-5b70-47ff-b95d-e1cc88088ee8" (UID: "4b799a1d-5b70-47ff-b95d-e1cc88088ee8"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 10:08:01.485154 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.483892 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4b799a1d-5b70-47ff-b95d-e1cc88088ee8" (UID: "4b799a1d-5b70-47ff-b95d-e1cc88088ee8"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 10:08:01.485495 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.485469 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4b799a1d-5b70-47ff-b95d-e1cc88088ee8" (UID: "4b799a1d-5b70-47ff-b95d-e1cc88088ee8"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 10:08:01.485666 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.485549 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-kube-api-access-fh4wc" (OuterVolumeSpecName: "kube-api-access-fh4wc") pod "4b799a1d-5b70-47ff-b95d-e1cc88088ee8" (UID: "4b799a1d-5b70-47ff-b95d-e1cc88088ee8"). InnerVolumeSpecName "kube-api-access-fh4wc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 10:08:01.485918 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.485893 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "4b799a1d-5b70-47ff-b95d-e1cc88088ee8" (UID: "4b799a1d-5b70-47ff-b95d-e1cc88088ee8"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 10:08:01.486010 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.485958 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4b799a1d-5b70-47ff-b95d-e1cc88088ee8" (UID: "4b799a1d-5b70-47ff-b95d-e1cc88088ee8"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 10:08:01.486010 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.485992 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4b799a1d-5b70-47ff-b95d-e1cc88088ee8" (UID: "4b799a1d-5b70-47ff-b95d-e1cc88088ee8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 10:08:01.492503 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.492481 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4b799a1d-5b70-47ff-b95d-e1cc88088ee8" (UID: "4b799a1d-5b70-47ff-b95d-e1cc88088ee8"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 10:08:01.584140 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.584097 2570 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-tls\") on node \"ip-10-0-143-196.ec2.internal\" DevicePath \"\"" Apr 16 10:08:01.584140 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.584134 2570 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-ca-trust-extracted\") on node \"ip-10-0-143-196.ec2.internal\" DevicePath \"\"" Apr 16 10:08:01.584140 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.584143 2570 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-bound-sa-token\") on node \"ip-10-0-143-196.ec2.internal\" DevicePath \"\"" Apr 16 10:08:01.584356 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.584154 2570 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-installation-pull-secrets\") on node \"ip-10-0-143-196.ec2.internal\" DevicePath \"\"" Apr 16 10:08:01.584356 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.584163 2570 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-registry-certificates\") on node \"ip-10-0-143-196.ec2.internal\" DevicePath \"\"" Apr 16 10:08:01.584356 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.584173 2570 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-image-registry-private-configuration\") on node \"ip-10-0-143-196.ec2.internal\" DevicePath 
\"\"" Apr 16 10:08:01.584356 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.584184 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fh4wc\" (UniqueName: \"kubernetes.io/projected/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-kube-api-access-fh4wc\") on node \"ip-10-0-143-196.ec2.internal\" DevicePath \"\"" Apr 16 10:08:01.584356 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.584193 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b799a1d-5b70-47ff-b95d-e1cc88088ee8-trusted-ca\") on node \"ip-10-0-143-196.ec2.internal\" DevicePath \"\"" Apr 16 10:08:01.774374 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.774332 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69c9cc7857-lwqxr"] Apr 16 10:08:01.777909 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.777886 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-69c9cc7857-lwqxr"] Apr 16 10:08:01.799192 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:01.799166 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b799a1d-5b70-47ff-b95d-e1cc88088ee8" path="/var/lib/kubelet/pods/4b799a1d-5b70-47ff-b95d-e1cc88088ee8/volumes" Apr 16 10:08:06.895585 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:06.895555 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dwt6x_9007958c-4532-4e2d-a3b7-94207d6d562a/serve-healthcheck-canary/0.log" Apr 16 10:08:47.624433 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:47.624378 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs\") pod \"network-metrics-daemon-9mqb2\" (UID: \"bccbed26-7fad-44dd-b120-4fe2758154e5\") " pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 
10:08:47.626660 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:47.626638 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bccbed26-7fad-44dd-b120-4fe2758154e5-metrics-certs\") pod \"network-metrics-daemon-9mqb2\" (UID: \"bccbed26-7fad-44dd-b120-4fe2758154e5\") " pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:08:47.697904 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:47.697876 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-btvgh\"" Apr 16 10:08:47.705910 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:47.705891 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9mqb2" Apr 16 10:08:47.819870 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:47.819847 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9mqb2"] Apr 16 10:08:47.822268 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:08:47.822231 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbccbed26_7fad_44dd_b120_4fe2758154e5.slice/crio-de70a334fe6c9b2c4d411741ee7b6a5ada3fd3cd15e6f07f4e111515785f541c WatchSource:0}: Error finding container de70a334fe6c9b2c4d411741ee7b6a5ada3fd3cd15e6f07f4e111515785f541c: Status 404 returned error can't find the container with id de70a334fe6c9b2c4d411741ee7b6a5ada3fd3cd15e6f07f4e111515785f541c Apr 16 10:08:48.569348 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:48.569308 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9mqb2" event={"ID":"bccbed26-7fad-44dd-b120-4fe2758154e5","Type":"ContainerStarted","Data":"de70a334fe6c9b2c4d411741ee7b6a5ada3fd3cd15e6f07f4e111515785f541c"} Apr 16 10:08:49.574177 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:49.574147 
2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9mqb2" event={"ID":"bccbed26-7fad-44dd-b120-4fe2758154e5","Type":"ContainerStarted","Data":"2ce855c77aa5ba7249b7c603c10eee5b59c49eced1d8def198e5016f285f396f"} Apr 16 10:08:49.574177 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:08:49.574181 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9mqb2" event={"ID":"bccbed26-7fad-44dd-b120-4fe2758154e5","Type":"ContainerStarted","Data":"d960bc40ab0d1f4bb21c1fcdc7514a7faa72e4c4e1a96908288b2a04e2b844e3"} Apr 16 10:09:35.695421 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:09:35.695393 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 10:09:35.695421 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:09:35.695423 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 10:09:35.700436 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:09:35.700401 2570 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 10:10:25.066890 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:25.066781 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9mqb2" podStartSLOduration=349.134829981 podStartE2EDuration="5m50.066765255s" podCreationTimestamp="2026-04-16 10:04:35 +0000 UTC" firstStartedPulling="2026-04-16 10:08:47.82398189 +0000 UTC m=+252.604391056" lastFinishedPulling="2026-04-16 10:08:48.75591716 +0000 UTC m=+253.536326330" observedRunningTime="2026-04-16 10:08:49.594123588 +0000 UTC m=+254.374532776" watchObservedRunningTime="2026-04-16 10:10:25.066765255 +0000 UTC m=+349.847174498" Apr 16 10:10:25.067858 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:25.067835 2570 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-6654h"] Apr 16 10:10:25.068115 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:25.068099 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b799a1d-5b70-47ff-b95d-e1cc88088ee8" containerName="registry" Apr 16 10:10:25.068186 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:25.068117 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b799a1d-5b70-47ff-b95d-e1cc88088ee8" containerName="registry" Apr 16 10:10:25.068239 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:25.068212 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b799a1d-5b70-47ff-b95d-e1cc88088ee8" containerName="registry" Apr 16 10:10:25.070789 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:25.070772 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-6654h" Apr 16 10:10:25.073067 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:25.073042 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 10:10:25.073161 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:25.073078 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-pgpmw\"" Apr 16 10:10:25.073761 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:25.073741 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 10:10:25.078897 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:25.078876 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-6654h"] Apr 16 10:10:25.144643 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:25.144612 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwdbm\" (UniqueName: 
\"kubernetes.io/projected/2b58b17d-36f6-4486-8532-bd0aa72e9e68-kube-api-access-qwdbm\") pod \"cert-manager-webhook-597b96b99b-6654h\" (UID: \"2b58b17d-36f6-4486-8532-bd0aa72e9e68\") " pod="cert-manager/cert-manager-webhook-597b96b99b-6654h" Apr 16 10:10:25.144787 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:25.144650 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2b58b17d-36f6-4486-8532-bd0aa72e9e68-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-6654h\" (UID: \"2b58b17d-36f6-4486-8532-bd0aa72e9e68\") " pod="cert-manager/cert-manager-webhook-597b96b99b-6654h" Apr 16 10:10:25.245608 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:25.245576 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2b58b17d-36f6-4486-8532-bd0aa72e9e68-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-6654h\" (UID: \"2b58b17d-36f6-4486-8532-bd0aa72e9e68\") " pod="cert-manager/cert-manager-webhook-597b96b99b-6654h" Apr 16 10:10:25.245743 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:25.245654 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwdbm\" (UniqueName: \"kubernetes.io/projected/2b58b17d-36f6-4486-8532-bd0aa72e9e68-kube-api-access-qwdbm\") pod \"cert-manager-webhook-597b96b99b-6654h\" (UID: \"2b58b17d-36f6-4486-8532-bd0aa72e9e68\") " pod="cert-manager/cert-manager-webhook-597b96b99b-6654h" Apr 16 10:10:25.253258 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:25.253229 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2b58b17d-36f6-4486-8532-bd0aa72e9e68-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-6654h\" (UID: \"2b58b17d-36f6-4486-8532-bd0aa72e9e68\") " pod="cert-manager/cert-manager-webhook-597b96b99b-6654h" Apr 16 
10:10:25.253362 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:25.253342 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwdbm\" (UniqueName: \"kubernetes.io/projected/2b58b17d-36f6-4486-8532-bd0aa72e9e68-kube-api-access-qwdbm\") pod \"cert-manager-webhook-597b96b99b-6654h\" (UID: \"2b58b17d-36f6-4486-8532-bd0aa72e9e68\") " pod="cert-manager/cert-manager-webhook-597b96b99b-6654h" Apr 16 10:10:25.384573 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:25.384462 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-6654h" Apr 16 10:10:25.501767 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:25.501733 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-6654h"] Apr 16 10:10:25.505042 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:10:25.505009 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b58b17d_36f6_4486_8532_bd0aa72e9e68.slice/crio-126daefdb7e3468eb5ba38c5d726dadb16efb7ffac44ca252965f2e1ce1eb7c1 WatchSource:0}: Error finding container 126daefdb7e3468eb5ba38c5d726dadb16efb7ffac44ca252965f2e1ce1eb7c1: Status 404 returned error can't find the container with id 126daefdb7e3468eb5ba38c5d726dadb16efb7ffac44ca252965f2e1ce1eb7c1 Apr 16 10:10:25.506835 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:25.506821 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 10:10:25.817141 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:25.817105 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-6654h" event={"ID":"2b58b17d-36f6-4486-8532-bd0aa72e9e68","Type":"ContainerStarted","Data":"126daefdb7e3468eb5ba38c5d726dadb16efb7ffac44ca252965f2e1ce1eb7c1"} Apr 16 10:10:28.826546 ip-10-0-143-196 kubenswrapper[2570]: I0416 
10:10:28.826441 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-6654h" event={"ID":"2b58b17d-36f6-4486-8532-bd0aa72e9e68","Type":"ContainerStarted","Data":"419f01577a8a3e6b4687f45c38b3ef7749cd94d5d33a1690ad901bbded1de8d1"} Apr 16 10:10:28.826546 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:28.826521 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-6654h" Apr 16 10:10:28.842413 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:28.842371 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-6654h" podStartSLOduration=0.806912619 podStartE2EDuration="3.842359843s" podCreationTimestamp="2026-04-16 10:10:25 +0000 UTC" firstStartedPulling="2026-04-16 10:10:25.506946188 +0000 UTC m=+350.287355352" lastFinishedPulling="2026-04-16 10:10:28.542393397 +0000 UTC m=+353.322802576" observedRunningTime="2026-04-16 10:10:28.840745184 +0000 UTC m=+353.621154370" watchObservedRunningTime="2026-04-16 10:10:28.842359843 +0000 UTC m=+353.622769029" Apr 16 10:10:34.830861 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:10:34.830831 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-6654h" Apr 16 10:11:26.226054 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:26.226019 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-9slgb"] Apr 16 10:11:26.229070 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:26.229054 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-9slgb" Apr 16 10:11:26.231123 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:26.231101 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\"" Apr 16 10:11:26.231549 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:26.231516 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 10:11:26.232051 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:26.232037 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-s2vxl\"" Apr 16 10:11:26.232119 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:26.232057 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\"" Apr 16 10:11:26.232119 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:26.232103 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 10:11:26.237510 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:26.237488 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-9slgb"] Apr 16 10:11:26.370434 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:26.370399 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc25cc55-a183-44df-81b1-ed2930014d85-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-9slgb\" (UID: \"fc25cc55-a183-44df-81b1-ed2930014d85\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-9slgb" Apr 16 10:11:26.370626 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:26.370461 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/fc25cc55-a183-44df-81b1-ed2930014d85-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-9slgb\" (UID: \"fc25cc55-a183-44df-81b1-ed2930014d85\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-9slgb" Apr 16 10:11:26.370626 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:26.370523 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtzxz\" (UniqueName: \"kubernetes.io/projected/fc25cc55-a183-44df-81b1-ed2930014d85-kube-api-access-jtzxz\") pod \"kubeflow-trainer-controller-manager-55f5694779-9slgb\" (UID: \"fc25cc55-a183-44df-81b1-ed2930014d85\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-9slgb" Apr 16 10:11:26.471111 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:26.471074 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc25cc55-a183-44df-81b1-ed2930014d85-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-9slgb\" (UID: \"fc25cc55-a183-44df-81b1-ed2930014d85\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-9slgb" Apr 16 10:11:26.471279 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:26.471132 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/fc25cc55-a183-44df-81b1-ed2930014d85-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-9slgb\" (UID: \"fc25cc55-a183-44df-81b1-ed2930014d85\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-9slgb" Apr 16 10:11:26.471279 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:26.471165 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtzxz\" (UniqueName: 
\"kubernetes.io/projected/fc25cc55-a183-44df-81b1-ed2930014d85-kube-api-access-jtzxz\") pod \"kubeflow-trainer-controller-manager-55f5694779-9slgb\" (UID: \"fc25cc55-a183-44df-81b1-ed2930014d85\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-9slgb" Apr 16 10:11:26.471776 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:26.471757 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/fc25cc55-a183-44df-81b1-ed2930014d85-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-9slgb\" (UID: \"fc25cc55-a183-44df-81b1-ed2930014d85\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-9slgb" Apr 16 10:11:26.473416 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:26.473397 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc25cc55-a183-44df-81b1-ed2930014d85-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-9slgb\" (UID: \"fc25cc55-a183-44df-81b1-ed2930014d85\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-9slgb" Apr 16 10:11:26.478784 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:26.478725 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtzxz\" (UniqueName: \"kubernetes.io/projected/fc25cc55-a183-44df-81b1-ed2930014d85-kube-api-access-jtzxz\") pod \"kubeflow-trainer-controller-manager-55f5694779-9slgb\" (UID: \"fc25cc55-a183-44df-81b1-ed2930014d85\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-9slgb" Apr 16 10:11:26.538576 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:26.538552 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-9slgb" Apr 16 10:11:26.652700 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:26.652601 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-9slgb"] Apr 16 10:11:26.655195 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:11:26.655164 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc25cc55_a183_44df_81b1_ed2930014d85.slice/crio-867a27b66e390fff7e4485ea55afa281ce66c6f074dfed14c7a9bba60750b97d WatchSource:0}: Error finding container 867a27b66e390fff7e4485ea55afa281ce66c6f074dfed14c7a9bba60750b97d: Status 404 returned error can't find the container with id 867a27b66e390fff7e4485ea55afa281ce66c6f074dfed14c7a9bba60750b97d Apr 16 10:11:26.972809 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:26.972737 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-9slgb" event={"ID":"fc25cc55-a183-44df-81b1-ed2930014d85","Type":"ContainerStarted","Data":"867a27b66e390fff7e4485ea55afa281ce66c6f074dfed14c7a9bba60750b97d"} Apr 16 10:11:28.981637 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:28.981606 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-9slgb" event={"ID":"fc25cc55-a183-44df-81b1-ed2930014d85","Type":"ContainerStarted","Data":"20128e6daf393da996e962093e3745ae9bfc4aa42ea1886bdf1656d1fa58672f"} Apr 16 10:11:28.982062 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:28.981735 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-9slgb" Apr 16 10:11:28.997835 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:28.997780 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-9slgb" podStartSLOduration=0.814779568 podStartE2EDuration="2.997763333s" podCreationTimestamp="2026-04-16 10:11:26 +0000 UTC" firstStartedPulling="2026-04-16 10:11:26.657087678 +0000 UTC m=+411.437496843" lastFinishedPulling="2026-04-16 10:11:28.840071433 +0000 UTC m=+413.620480608" observedRunningTime="2026-04-16 10:11:28.996231662 +0000 UTC m=+413.776640850" watchObservedRunningTime="2026-04-16 10:11:28.997763333 +0000 UTC m=+413.778172521" Apr 16 10:11:44.988545 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:11:44.988497 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-9slgb" Apr 16 10:14:35.717851 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:14:35.717821 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 10:14:35.718397 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:14:35.717821 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 10:19:35.738009 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:19:35.737931 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 10:19:35.738573 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:19:35.738121 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 10:24:35.757839 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:24:35.757800 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 
10:24:35.759328 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:24:35.758249 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 10:29:35.775965 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:29:35.775940 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 10:29:35.778515 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:29:35.778490 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 10:34:35.793439 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:34:35.793307 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 10:34:35.797372 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:34:35.795231 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 10:39:35.809247 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:39:35.809125 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 10:39:35.812552 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:39:35.812517 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 10:44:35.824956 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:44:35.824849 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 
10:44:35.829432 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:44:35.829412 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 10:49:35.841222 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:49:35.841117 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 10:49:35.845804 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:49:35.845786 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 10:54:18.270851 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:18.270762 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-9slgb_fc25cc55-a183-44df-81b1-ed2930014d85/manager/0.log" Apr 16 10:54:18.749518 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:18.749441 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-9slgb_fc25cc55-a183-44df-81b1-ed2930014d85/manager/0.log" Apr 16 10:54:19.465643 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:19.465607 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-9slgb_fc25cc55-a183-44df-81b1-ed2930014d85/manager/0.log" Apr 16 10:54:35.863759 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:35.863663 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 10:54:35.867179 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:35.867161 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 10:54:59.229488 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:59.229453 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d9gfw/must-gather-hjldf"] Apr 16 10:54:59.231466 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:59.231451 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d9gfw/must-gather-hjldf" Apr 16 10:54:59.234287 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:59.234261 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d9gfw\"/\"kube-root-ca.crt\"" Apr 16 10:54:59.234872 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:59.234851 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-d9gfw\"/\"default-dockercfg-glr59\"" Apr 16 10:54:59.234968 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:59.234922 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d9gfw\"/\"openshift-service-ca.crt\"" Apr 16 10:54:59.240321 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:59.240298 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d9gfw/must-gather-hjldf"] Apr 16 10:54:59.338714 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:59.338685 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/468fa0c9-fcd7-4bd0-8495-12b1ef6be95a-must-gather-output\") pod \"must-gather-hjldf\" (UID: \"468fa0c9-fcd7-4bd0-8495-12b1ef6be95a\") " pod="openshift-must-gather-d9gfw/must-gather-hjldf" Apr 16 10:54:59.338877 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:59.338726 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7kzdw\" (UniqueName: \"kubernetes.io/projected/468fa0c9-fcd7-4bd0-8495-12b1ef6be95a-kube-api-access-7kzdw\") pod \"must-gather-hjldf\" (UID: \"468fa0c9-fcd7-4bd0-8495-12b1ef6be95a\") " pod="openshift-must-gather-d9gfw/must-gather-hjldf" Apr 16 10:54:59.439594 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:59.439565 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/468fa0c9-fcd7-4bd0-8495-12b1ef6be95a-must-gather-output\") pod \"must-gather-hjldf\" (UID: \"468fa0c9-fcd7-4bd0-8495-12b1ef6be95a\") " pod="openshift-must-gather-d9gfw/must-gather-hjldf" Apr 16 10:54:59.439741 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:59.439602 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kzdw\" (UniqueName: \"kubernetes.io/projected/468fa0c9-fcd7-4bd0-8495-12b1ef6be95a-kube-api-access-7kzdw\") pod \"must-gather-hjldf\" (UID: \"468fa0c9-fcd7-4bd0-8495-12b1ef6be95a\") " pod="openshift-must-gather-d9gfw/must-gather-hjldf" Apr 16 10:54:59.439890 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:59.439871 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/468fa0c9-fcd7-4bd0-8495-12b1ef6be95a-must-gather-output\") pod \"must-gather-hjldf\" (UID: \"468fa0c9-fcd7-4bd0-8495-12b1ef6be95a\") " pod="openshift-must-gather-d9gfw/must-gather-hjldf" Apr 16 10:54:59.447887 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:59.447871 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kzdw\" (UniqueName: \"kubernetes.io/projected/468fa0c9-fcd7-4bd0-8495-12b1ef6be95a-kube-api-access-7kzdw\") pod \"must-gather-hjldf\" (UID: \"468fa0c9-fcd7-4bd0-8495-12b1ef6be95a\") " pod="openshift-must-gather-d9gfw/must-gather-hjldf" Apr 16 10:54:59.540275 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:59.540243 2570 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d9gfw/must-gather-hjldf" Apr 16 10:54:59.665874 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:59.665849 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d9gfw/must-gather-hjldf"] Apr 16 10:54:59.668074 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:54:59.668045 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod468fa0c9_fcd7_4bd0_8495_12b1ef6be95a.slice/crio-9a9f18e61a11d7ccf7b99b9dbbcabeea15c831a9afd6532a205fafa5bc593bfe WatchSource:0}: Error finding container 9a9f18e61a11d7ccf7b99b9dbbcabeea15c831a9afd6532a205fafa5bc593bfe: Status 404 returned error can't find the container with id 9a9f18e61a11d7ccf7b99b9dbbcabeea15c831a9afd6532a205fafa5bc593bfe Apr 16 10:54:59.669839 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:59.669823 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 10:54:59.697904 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:54:59.697874 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d9gfw/must-gather-hjldf" event={"ID":"468fa0c9-fcd7-4bd0-8495-12b1ef6be95a","Type":"ContainerStarted","Data":"9a9f18e61a11d7ccf7b99b9dbbcabeea15c831a9afd6532a205fafa5bc593bfe"} Apr 16 10:55:00.703515 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:00.703426 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d9gfw/must-gather-hjldf" event={"ID":"468fa0c9-fcd7-4bd0-8495-12b1ef6be95a","Type":"ContainerStarted","Data":"5eaa11b18374fc3da238a31dfa206954dd12b249c09ec59503b2aeba27cda895"} Apr 16 10:55:00.703515 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:00.703474 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d9gfw/must-gather-hjldf" 
event={"ID":"468fa0c9-fcd7-4bd0-8495-12b1ef6be95a","Type":"ContainerStarted","Data":"bbff565550d3baf3d5675d2ddb424436d070daf60eaf7c2c523083c15cf854a3"} Apr 16 10:55:01.972417 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:01.972386 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-znlvq_e9fc47e9-a934-4e31-b64d-70aef48ffd3e/global-pull-secret-syncer/0.log" Apr 16 10:55:02.108463 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:02.108430 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-s7wkr_ee8a5c7c-8f19-49e3-8fed-4d2a3920d7cf/konnectivity-agent/0.log" Apr 16 10:55:02.191315 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:02.191283 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-196.ec2.internal_9c721cdecb4375340a2fbe75779b609c/haproxy/0.log" Apr 16 10:55:06.126356 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:06.126327 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wn2sw_8d688b17-07fb-4393-86fa-0a470c907047/node-exporter/0.log" Apr 16 10:55:06.190644 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:06.190620 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wn2sw_8d688b17-07fb-4393-86fa-0a470c907047/kube-rbac-proxy/0.log" Apr 16 10:55:06.229640 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:06.229612 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wn2sw_8d688b17-07fb-4393-86fa-0a470c907047/init-textfile/0.log" Apr 16 10:55:08.045599 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.045561 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-qjjn5_b74f7a44-b523-4723-8224-aa8bc5a5759c/networking-console-plugin/0.log" Apr 16 10:55:08.193891 ip-10-0-143-196 kubenswrapper[2570]: 
I0416 10:55:08.193824 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d9gfw/must-gather-hjldf" podStartSLOduration=8.505346739 podStartE2EDuration="9.193802775s" podCreationTimestamp="2026-04-16 10:54:59 +0000 UTC" firstStartedPulling="2026-04-16 10:54:59.669945978 +0000 UTC m=+3024.450355146" lastFinishedPulling="2026-04-16 10:55:00.358402014 +0000 UTC m=+3025.138811182" observedRunningTime="2026-04-16 10:55:00.727653508 +0000 UTC m=+3025.508062709" watchObservedRunningTime="2026-04-16 10:55:08.193802775 +0000 UTC m=+3032.974211963" Apr 16 10:55:08.194322 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.194305 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn"] Apr 16 10:55:08.197288 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.197271 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" Apr 16 10:55:08.205210 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.205179 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/649c53aa-cbc7-4691-9f93-1043411300a1-lib-modules\") pod \"perf-node-gather-daemonset-846sn\" (UID: \"649c53aa-cbc7-4691-9f93-1043411300a1\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" Apr 16 10:55:08.205338 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.205247 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/649c53aa-cbc7-4691-9f93-1043411300a1-podres\") pod \"perf-node-gather-daemonset-846sn\" (UID: \"649c53aa-cbc7-4691-9f93-1043411300a1\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" Apr 16 10:55:08.205338 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.205281 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7f6n\" (UniqueName: \"kubernetes.io/projected/649c53aa-cbc7-4691-9f93-1043411300a1-kube-api-access-v7f6n\") pod \"perf-node-gather-daemonset-846sn\" (UID: \"649c53aa-cbc7-4691-9f93-1043411300a1\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" Apr 16 10:55:08.205338 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.205307 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/649c53aa-cbc7-4691-9f93-1043411300a1-sys\") pod \"perf-node-gather-daemonset-846sn\" (UID: \"649c53aa-cbc7-4691-9f93-1043411300a1\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" Apr 16 10:55:08.205338 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.205329 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/649c53aa-cbc7-4691-9f93-1043411300a1-proc\") pod \"perf-node-gather-daemonset-846sn\" (UID: \"649c53aa-cbc7-4691-9f93-1043411300a1\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" Apr 16 10:55:08.205899 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.205878 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn"] Apr 16 10:55:08.305804 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.305722 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/649c53aa-cbc7-4691-9f93-1043411300a1-podres\") pod \"perf-node-gather-daemonset-846sn\" (UID: \"649c53aa-cbc7-4691-9f93-1043411300a1\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" Apr 16 10:55:08.305804 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.305775 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v7f6n\" (UniqueName: \"kubernetes.io/projected/649c53aa-cbc7-4691-9f93-1043411300a1-kube-api-access-v7f6n\") pod \"perf-node-gather-daemonset-846sn\" (UID: \"649c53aa-cbc7-4691-9f93-1043411300a1\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" Apr 16 10:55:08.305804 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.305801 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/649c53aa-cbc7-4691-9f93-1043411300a1-sys\") pod \"perf-node-gather-daemonset-846sn\" (UID: \"649c53aa-cbc7-4691-9f93-1043411300a1\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" Apr 16 10:55:08.306092 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.305821 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/649c53aa-cbc7-4691-9f93-1043411300a1-proc\") pod \"perf-node-gather-daemonset-846sn\" (UID: \"649c53aa-cbc7-4691-9f93-1043411300a1\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" Apr 16 10:55:08.306092 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.305869 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/649c53aa-cbc7-4691-9f93-1043411300a1-podres\") pod \"perf-node-gather-daemonset-846sn\" (UID: \"649c53aa-cbc7-4691-9f93-1043411300a1\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" Apr 16 10:55:08.306092 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.305884 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/649c53aa-cbc7-4691-9f93-1043411300a1-lib-modules\") pod \"perf-node-gather-daemonset-846sn\" (UID: \"649c53aa-cbc7-4691-9f93-1043411300a1\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" 
Apr 16 10:55:08.306092 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.305898 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/649c53aa-cbc7-4691-9f93-1043411300a1-sys\") pod \"perf-node-gather-daemonset-846sn\" (UID: \"649c53aa-cbc7-4691-9f93-1043411300a1\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" Apr 16 10:55:08.306092 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.305932 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/649c53aa-cbc7-4691-9f93-1043411300a1-proc\") pod \"perf-node-gather-daemonset-846sn\" (UID: \"649c53aa-cbc7-4691-9f93-1043411300a1\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" Apr 16 10:55:08.306092 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.305970 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/649c53aa-cbc7-4691-9f93-1043411300a1-lib-modules\") pod \"perf-node-gather-daemonset-846sn\" (UID: \"649c53aa-cbc7-4691-9f93-1043411300a1\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" Apr 16 10:55:08.315293 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.315267 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7f6n\" (UniqueName: \"kubernetes.io/projected/649c53aa-cbc7-4691-9f93-1043411300a1-kube-api-access-v7f6n\") pod \"perf-node-gather-daemonset-846sn\" (UID: \"649c53aa-cbc7-4691-9f93-1043411300a1\") " pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" Apr 16 10:55:08.507952 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.507918 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" Apr 16 10:55:08.646576 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.646417 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn"] Apr 16 10:55:08.649372 ip-10-0-143-196 kubenswrapper[2570]: W0416 10:55:08.649346 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod649c53aa_cbc7_4691_9f93_1043411300a1.slice/crio-5f5d5b01c61cd2103594c225056cab853a7d46fc77fca5518fbc70ffcdfedc53 WatchSource:0}: Error finding container 5f5d5b01c61cd2103594c225056cab853a7d46fc77fca5518fbc70ffcdfedc53: Status 404 returned error can't find the container with id 5f5d5b01c61cd2103594c225056cab853a7d46fc77fca5518fbc70ffcdfedc53 Apr 16 10:55:08.730393 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:08.730360 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" event={"ID":"649c53aa-cbc7-4691-9f93-1043411300a1","Type":"ContainerStarted","Data":"5f5d5b01c61cd2103594c225056cab853a7d46fc77fca5518fbc70ffcdfedc53"} Apr 16 10:55:09.734866 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:09.734830 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" event={"ID":"649c53aa-cbc7-4691-9f93-1043411300a1","Type":"ContainerStarted","Data":"238f6bb3dfaed0dfbcb92ea293369b01c01e046eb6b6eda96601aac2b3b7e7b9"} Apr 16 10:55:09.735246 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:09.734943 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" Apr 16 10:55:09.752152 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:09.752103 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" 
podStartSLOduration=1.752087782 podStartE2EDuration="1.752087782s" podCreationTimestamp="2026-04-16 10:55:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 10:55:09.751023782 +0000 UTC m=+3034.531432965" watchObservedRunningTime="2026-04-16 10:55:09.752087782 +0000 UTC m=+3034.532496970" Apr 16 10:55:09.898832 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:09.898801 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4hw98_f8ed2d4a-7837-47c0-aa78-9cb656214b23/dns/0.log" Apr 16 10:55:09.919629 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:09.919600 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4hw98_f8ed2d4a-7837-47c0-aa78-9cb656214b23/kube-rbac-proxy/0.log" Apr 16 10:55:10.079380 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:10.079354 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rj4q8_fc1a6ffe-e50d-41c1-aa12-db0a4d10232f/dns-node-resolver/0.log" Apr 16 10:55:10.613119 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:10.613094 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-v8sh7_4a61fb09-0793-4b69-b34c-784caf0249e5/node-ca/0.log" Apr 16 10:55:11.698665 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:11.698636 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dwt6x_9007958c-4532-4e2d-a3b7-94207d6d562a/serve-healthcheck-canary/0.log" Apr 16 10:55:12.271647 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:12.271620 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xf8wh_120ed0a9-f7ea-4130-a3dd-d983418dbacb/kube-rbac-proxy/0.log" Apr 16 10:55:12.316086 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:12.316055 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-xf8wh_120ed0a9-f7ea-4130-a3dd-d983418dbacb/exporter/0.log" Apr 16 10:55:12.353923 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:12.353896 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xf8wh_120ed0a9-f7ea-4130-a3dd-d983418dbacb/extractor/0.log" Apr 16 10:55:15.748168 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:15.748140 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-d9gfw/perf-node-gather-daemonset-846sn" Apr 16 10:55:18.854669 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:18.854637 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v9smg_191a958c-a1a7-4e33-9456-1f482a72fb5e/kube-multus-additional-cni-plugins/0.log" Apr 16 10:55:18.880683 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:18.880606 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v9smg_191a958c-a1a7-4e33-9456-1f482a72fb5e/egress-router-binary-copy/0.log" Apr 16 10:55:18.901508 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:18.901472 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v9smg_191a958c-a1a7-4e33-9456-1f482a72fb5e/cni-plugins/0.log" Apr 16 10:55:18.923930 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:18.923903 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v9smg_191a958c-a1a7-4e33-9456-1f482a72fb5e/bond-cni-plugin/0.log" Apr 16 10:55:18.952201 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:18.952172 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v9smg_191a958c-a1a7-4e33-9456-1f482a72fb5e/routeoverride-cni/0.log" Apr 16 10:55:18.977596 ip-10-0-143-196 kubenswrapper[2570]: I0416 
10:55:18.977569 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v9smg_191a958c-a1a7-4e33-9456-1f482a72fb5e/whereabouts-cni-bincopy/0.log" Apr 16 10:55:19.000575 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:19.000547 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v9smg_191a958c-a1a7-4e33-9456-1f482a72fb5e/whereabouts-cni/0.log" Apr 16 10:55:19.073452 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:19.073423 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hhvjb_22b85402-76c2-472c-90f0-25a54604bbb9/kube-multus/0.log" Apr 16 10:55:19.200025 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:19.199942 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9mqb2_bccbed26-7fad-44dd-b120-4fe2758154e5/network-metrics-daemon/0.log" Apr 16 10:55:19.233498 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:19.233468 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9mqb2_bccbed26-7fad-44dd-b120-4fe2758154e5/kube-rbac-proxy/0.log" Apr 16 10:55:20.706731 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:20.706699 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-controller/0.log" Apr 16 10:55:20.733082 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:20.733054 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/0.log" Apr 16 10:55:20.762423 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:20.762403 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovn-acl-logging/1.log" Apr 16 10:55:20.789310 ip-10-0-143-196 
kubenswrapper[2570]: I0416 10:55:20.789288 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/kube-rbac-proxy-node/0.log" Apr 16 10:55:20.819500 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:20.819477 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 10:55:20.853577 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:20.853554 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/northd/0.log" Apr 16 10:55:20.883718 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:20.883685 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/nbdb/0.log" Apr 16 10:55:20.913543 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:20.913510 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/sbdb/0.log" Apr 16 10:55:21.111659 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:21.111631 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmhhw_23649a52-557a-477c-838c-84b209078bbb/ovnkube-controller/0.log" Apr 16 10:55:22.085356 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:22.085327 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-2h7wk_dba5a2c9-dd08-4be6-a76d-85742ada944e/network-check-target-container/0.log" Apr 16 10:55:22.944685 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:22.944652 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-g9ftj_54582370-009e-4add-bc9e-5a5c18069e72/iptables-alerter/0.log" Apr 16 
10:55:23.563292 ip-10-0-143-196 kubenswrapper[2570]: I0416 10:55:23.563259 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-8hjnc_9fb1190b-eb18-4a5c-91c6-5bec59d57dc4/tuned/0.log"